Linear Algebra
Dot product
Definition

A vanilla vector space has no operation that combines two vectors other than vector addition. However, a vector space can be equipped with an inner product to form an inner product space. An inner product or dot product between vectors $\mathbf{a}$ and $\mathbf{b}$ is defined as follows:

$$\mathbf{a} \cdot \mathbf{b} = \sum_{i=1}^{n} a_i b_i = a_1 b_1 + a_2 b_2 + \dots + a_n b_n$$
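To make the definition concrete, here is a minimal sketch (my own illustration, not part of the original text; it assumes Python with NumPy available) that computes the dot product as the sum of element-wise products and checks the result against NumPy's `np.dot`:

```python
import numpy as np

def dot(a, b):
    """Dot product as the sum of element-wise products: sum_i a_i * b_i."""
    assert len(a) == len(b), "vectors must have the same dimension"
    return sum(a_i * b_i for a_i, b_i in zip(a, b))

a = [1.0, 2.0, 3.0]
b = [4.0, 5.0, 6.0]

print(dot(a, b))     # 1*4 + 2*5 + 3*6 = 32.0
print(np.dot(a, b))  # NumPy agrees: 32.0
```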
Norm of a vector

To see the benefit and the interpretation of the dot product, let's take a closer look at the case where we calculate the dot product of a vector with itself:

$$\mathbf{a} \cdot \mathbf{a} = \sum_{i=1}^{n} a_i^2$$

What we can see from this is that it corresponds to the squared norm/magnitude of the vector $\mathbf{a}$. The usual notation for the norm of a vector is $\|\mathbf{a}\|$, so we can write:

$$\|\mathbf{a}\| = \sqrt{\mathbf{a} \cdot \mathbf{a}}$$

As a simple example, let's imagine that we have a 2D vector $\mathbf{v} = (v_1, v_2)$, where $v_1$ and $v_2$ are its components, as shown in the figure below.

If we calculate the dot product of the vector with itself, we get:

$$\mathbf{v} \cdot \mathbf{v} = v_1^2 + v_2^2 = \|\mathbf{v}\|^2,$$

which is exactly the Pythagorean theorem in 2D.
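As a small runnable sketch of the norm calculation (the 3-4-5 vector below is an illustrative choice of mine, not taken from the article), we can recover the magnitude of a vector by taking the square root of its dot product with itself:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    """Norm/magnitude of a vector: sqrt(v . v)."""
    return math.sqrt(dot(v, v))

# 2D example: the 3-4-5 right triangle makes the Pythagorean theorem visible.
v = [3.0, 4.0]
print(dot(v, v))  # squared norm: 3^2 + 4^2 = 25.0
print(norm(v))    # sqrt(25) = 5.0
```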
Angle between two vectors

Besides being useful for calculating norms of vectors, the dot product can be used as a measure of similarity. If we imagine two $n$-dimensional vectors $\mathbf{a}$ and $\mathbf{b}$, the angle $\theta$ between them can be calculated using the following formula:

$$\cos \theta = \frac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\| \, \|\mathbf{b}\|}$$

When the cosine of the angle between two vectors is equal to $1$, the vectors are perfectly aligned (interpreted as being as similar as possible), and when it is equal to $0$, the vectors are perpendicular (interpreted as being as different as possible). This quantity, often called the cosine similarity, is used as a measure of similarity in many areas, such as Natural Language Processing.
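The sketch below (again my own illustrative Python, with hypothetical example vectors) computes the cosine similarity from the formula above and converts it back to an angle:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

def cosine_similarity(a, b):
    """cos(theta) = (a . b) / (||a|| * ||b||)."""
    return dot(a, b) / (norm(a) * norm(b))

a = [1.0, 0.0]
b = [1.0, 1.0]

cos_theta = cosine_similarity(a, b)
print(cos_theta)                           # ~0.7071
print(math.degrees(math.acos(cos_theta)))  # ~45 degrees

# Aligned vectors give cosine similarity 1; perpendicular vectors give 0.
print(cosine_similarity([1.0, 0.0], [2.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 3.0]))  # 0.0
```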
Summary

We have introduced a new operation that we can use to manipulate vectors: the dot product. The dot product can be used to calculate the norm or magnitude of a vector, as well as the cosine similarity between two vectors. When the cosine similarity is 1, the vectors are perfectly aligned, while a cosine similarity of 0 indicates that the vectors are perpendicular. This is used in various fields of machine learning.