## What are Tensors?

Since all forms of data can be represented as numbers, the job of a computer (at a certain level of abstraction) can be considered as storing and manipulating lots of numbers. Different arrangements of numbers go by different names:

- a line of numbers is a vector,
- a 2D grid of numbers is a matrix,
- a 3D grid is a tensor.

To extend this idea beyond three dimensions, we need to change our thinking. Fortunately, no new names are needed — from this point on, they are all tensors!

Imagine a so-called "human computer," a person hired to do numerical calculations prior to the invention of electronic computers. Given the task of handling various numbers in an orderly fashion, they could invent a system whereby indices label, say, the row, column, page, filing cabinet, etc., where each number is written down and stored. In this way, the indices do not indicate spatial dimensions but rather organizational information: where each number is kept. When we return to electronic computers, we can retain this perspective, because ultimately the indices indicate where the computer stores the numbers in its memory. Programmers, who are rarely concerned with memory allocation, focus on the idea that an appropriately long list of indices can refer to any stored number. This level of abstraction may seem to leave us with no visual tools at our disposal.

![[Pasted image 20240406155516.png]]

The Hilbert space is equipped with a natural tensor-product decomposition into subsystems.

### Matrix Multiplications

- A higher-dimensional tensor can represent a matrix multiplication algorithm.
- Computers need more time to multiply than to add or subtract.
- It's easy to _compose_ a tensor from three matrices. However, to _decompose_ a tensor (the reverse operation) is not straightforward; the procedure could result in any of thousands of potential sets of matrices.
- Any valid decomposition of the tensor into three matrices represents a valid algorithm for matrix multiplication.
- The number of columns in each of the three factor matrices equals the number of multiplications the algorithm requires.
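The compose/decompose claim above can be checked concretely. What follows is a minimal sketch (assuming NumPy is available) using Strassen's algorithm, a well-known rank-7 decomposition of the 2×2 matrix multiplication tensor: composing its three factor matrices reproduces the true multiplication tensor, and the shared column count — seven — is exactly the number of scalar multiplications the algorithm uses, one fewer than the naive eight.

```python
import numpy as np

# Factor matrices encoding Strassen's algorithm. Each of the 7 columns
# corresponds to one intermediate product M_r; rows index the flattened
# 2x2 matrix entries [x11, x12, x21, x22].
U = np.array([  # coefficients applied to the entries of A
    [1, 0, 1, 0, 1, -1,  0],
    [0, 0, 0, 0, 1,  0,  1],
    [0, 1, 0, 0, 0,  1,  0],
    [1, 1, 0, 1, 0,  0, -1],
])
V = np.array([  # coefficients applied to the entries of B
    [1, 1,  0, -1, 0, 1, 0],
    [0, 0,  1,  0, 0, 1, 0],
    [0, 0,  0,  1, 0, 0, 1],
    [1, 0, -1,  0, 1, 0, 1],
])
W = np.array([  # coefficients combining the 7 products into C
    [1,  0, 0, 1, -1, 0, 1],
    [0,  0, 1, 0,  1, 0, 0],
    [0,  1, 0, 1,  0, 0, 0],
    [1, -1, 1, 0,  0, 1, 0],
])

# Composing the three matrices: T[i,j,k] = sum_r U[i,r] * V[j,r] * W[k,r].
T_strassen = np.einsum("ir,jr,kr->ijk", U, V, W)

# The "true" 2x2 matrix multiplication tensor: T[i,j,k] = 1 exactly when
# entry k of C receives the product of entry i of A and entry j of B.
T_true = np.zeros((4, 4, 4), dtype=int)
for m in range(2):
    for n in range(2):
        for p in range(2):
            T_true[2 * m + p, 2 * p + n, 2 * m + n] = 1

assert np.array_equal(T_strassen, T_true)  # a valid decomposition

# Running the algorithm the decomposition encodes: 7 multiplications.
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, (2, 2))
B = rng.integers(-5, 5, (2, 2))
products = (U.T @ A.ravel()) * (V.T @ B.ravel())  # the 7 scalar products
C = (W @ products).reshape(2, 2)
assert np.array_equal(C, A @ B)

print(U.shape[1])  # columns per factor = multiplications used → 7
```

Note how the decomposition is used directly as an algorithm: each column of `U` and `V` builds one linear combination of the entries of `A` and `B`, those are multiplied pairwise (the only multiplications performed), and `W` recombines the products into `C`.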