
We've defined the basis of a vector space, its dimension, and the ideas of linear independence and linear combinations. We found a test for independence: vectors are independent if none of them can be written as a linear combination of the others, and we've talked about the dimensionality of a vector space in terms of the number of independent basis vectors it has. Then we went on to find the magnitude or modulus of a vector, the dot product, and the scalar and vector projections. We also defined vector addition and scaling a vector by a number, making it bigger or reversing its direction. We've looked at vectors as objects that describe where we are in space, which could be a physical space, a space of data, or the parameter space of the parameters of a function. Finally, we discussed what it means to map vectors from one space to another and how that is going to be useful in data science and machine learning.

For the lazy, jump straight to this video. It might be worth it to watch the first three videos of the Essence of Linear Algebra series.

Now, let's think about what happens when we map from one basis to another. Any mapping that we do from one set of basis vectors (one coordinate system) to another set of basis vectors (another coordinate system) keeps the vector space being a regularly spaced grid, where our original rules of vector addition and multiplication by a scalar still work. The axes of the original grid are projected onto the new grid and may take different values there, but crucially, the projection keeps the grid evenly spaced.
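As a concrete illustration of that last point, here is a minimal NumPy sketch (the basis matrix B and the vectors u and v are made up for the example, not taken from the course): a change of basis is a linear operation, so vector addition and multiplication by a scalar give the same answer whether we apply them before or after the transformation.

```python
import numpy as np

# Columns of B are the new basis vectors, written in the original coordinates.
# These particular numbers are only an illustration.
B = np.array([[2.0, -1.0],
              [1.0,  1.0]])
B_inv = np.linalg.inv(B)   # maps original coordinates to coordinates in the new basis

u = np.array([3.0, 4.0])
v = np.array([-1.0, 2.0])

# Linearity: changing basis after adding or scaling gives the same result
# as adding or scaling the already-transformed vectors.
print(np.allclose(B_inv @ (u + v), B_inv @ u + B_inv @ v))   # True
print(np.allclose(B_inv @ (3 * u), 3 * (B_inv @ u)))         # True
```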

So, if at all possible, you want to construct what's called an orthonormal basis vector set, where all the vectors in the set are at 90 degrees to each other and are all of unit length. They don't have to be orthogonal (or normal) to each other, but, as it turns out, everything is going to be much easier if they are.
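To see why, here is a small sketch, assuming an illustrative 45-degree rotated basis for two-dimensional space: when the basis vectors are orthonormal, the coordinate of a vector along each basis vector is just its scalar projection onto that vector, i.e. a dot product, and no matrix inversion is needed.

```python
import numpy as np

# An orthonormal basis for 2-D space (rotated 45 degrees): the vectors are
# unit length and at 90 degrees to each other.
b1 = np.array([1.0, 1.0]) / np.sqrt(2)
b2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([3.0, 4.0])

# Because the basis is orthonormal, the coordinates of v in the new basis
# are simply dot products (scalar projections onto b1 and b2).
v_new = np.array([v @ b1, v @ b2])
print(v_new)   # [4.9497... 0.7071...], i.e. [7/sqrt(2), 1/sqrt(2)]

# Reconstructing v from its new coordinates recovers the original vector,
# confirming nothing is lost in the change of basis.
print(v_new[0] * b1 + v_new[1] * b2)   # [3. 4.]
```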


The length, magnitude, modulus and norm of a vector are all the same thing, and just represent a difference in terminology. If we are thinking of a vector as representing the line segment from the origin to a given point (i.e., the geometric interpretation), we may interpret the norm as the length of this line segment. If we are thinking of a vector as representing a physical quantity like acceleration or velocity, we may interpret the norm as the magnitude of this quantity (how "large" it is, regardless of its direction). By Pythagoras's Theorem, for a vector r with components r_1 and r_2, \vert r \vert = \sqrt{r_1^2 + r_2^2}.
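As a quick numerical check (the vector [3, 4] is only an example), the norm can be computed directly from Pythagoras's Theorem and compared with NumPy's built-in:

```python
import numpy as np

# Pythagoras in code: the norm of r = [r1, r2] is sqrt(r1^2 + r2^2).
r = np.array([3.0, 4.0])
norm_by_hand = np.sqrt(np.sum(r ** 2))
print(norm_by_hand)        # 5.0
print(np.linalg.norm(r))   # 5.0 — NumPy's built-in norm gives the same result
```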
