5 docs tagged with "matrices"

Broadcasting

Broadcasting is a technique used when working with tensors such as scalars, vectors and matrices. It is very useful in practice, but it is not really a mathematical concept. It allows libraries like numpy to perform element-wise arithmetic operations (such as element-wise addition, or element-wise multiplication, also known as the Hadamard product) even though the arrays have different shapes. If the arrays meet certain constraints, the smaller array is "broadcast" across the larger array so that the two have compatible shapes for the operation. Let's look at some simple examples of how broadcasting in numpy works. We often want to perform element-wise multiplication between two arrays; for two vectors of the same shape this works directly.
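A minimal sketch of broadcasting in numpy (the arrays here are illustrative, not from the original article): element-wise multiplication of same-shaped vectors, a scalar broadcast over a vector, and a vector broadcast across the rows of a matrix.

```python
import numpy as np

# Element-wise multiplication of two vectors with the same shape works directly.
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
print(a * b)  # [ 4. 10. 18.]

# Broadcasting: the scalar is "stretched" to match the vector's shape.
print(a * 2.0)  # [2. 4. 6.]

# A shape-(3,) vector is broadcast across each row of a shape-(2, 3) matrix.
m = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
print(m + a)
```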

Eigenvalues And Eigenvectors

Before we talk about eigenvalues and eigenvectors, let us remind ourselves that vectors can be transformed using matrices. For example, we can rotate a vector using the rotation matrix:

Hadamard Product

The Hadamard product is the element-wise product of two matrices. It is how some people might first assume matrix multiplication works, which is wrong.
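A short sketch contrasting the two operations (example matrices are my own): the Hadamard product multiplies matching entries, while true matrix multiplication takes dot products of rows with columns.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Hadamard (element-wise) product: each entry multiplied with its counterpart.
print(A * B)  # [[ 5 12]
              #  [21 32]]

# Matrix multiplication: rows of A dotted with columns of B.
print(A @ B)  # [[19 22]
              #  [43 50]]
```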

Singular Value Decomposition - SVD

The eigendecomposition only works for square matrices. The singular value decomposition, SVD for short, is a generalization of the eigendecomposition that can also be applied to rectangular matrices. Like the eigendecomposition, the singular value decomposition factors a matrix into 3 matrices.
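A minimal sketch, assuming numpy's `np.linalg.svd` (the rectangular matrix is an illustrative example): the SVD of a 2x3 matrix produces three factors whose product reconstructs the original.

```python
import numpy as np

# A rectangular 2x3 matrix: the eigendecomposition is not defined here, the SVD is.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

U, s, Vt = np.linalg.svd(A)
print(U.shape, s.shape, Vt.shape)  # (2, 2) (2,) (3, 3)

# Rebuild the rectangular Sigma matrix from the singular values
# and reconstruct A as U @ Sigma @ Vt.
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)
print(np.allclose(A, U @ Sigma @ Vt))  # True
```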