cannibalismo t1_iurgjhi wrote
We haven't had significantly better algorithms since 1969? That seems far-fetched.
SpaceDepix OP t1_iurh3kw wrote
As per the official DeepMind article (source provided in my article):
“In our paper, published today in Nature, we introduce AlphaTensor, the first artificial intelligence (AI) system for discovering novel, efficient, and provably correct algorithms for fundamental tasks such as matrix multiplication. This sheds light on a 50-year-old open question in mathematics about finding the fastest way to multiply two matrices.”
“…Beyond this example, AlphaTensor’s algorithm improves on Strassen’s two-level algorithm in a finite field for the first time since its discovery 50 years ago. These algorithms for multiplying small matrices can be used as primitives to multiply much larger matrices of arbitrary size.”
blueSGL t1_iuriw6r wrote
Matrix multiplication is built out of additions (and subtractions) and multiplications.
GPUs can do additions (and subtractions) faster than multiplications.
By rejiggering the way the matrix multiplication is written, you can use fewer multiplications and more additions, so it runs faster on the same hardware.
https://en.wikipedia.org/wiki/Strassen_algorithm
>Volker Strassen first published this algorithm in 1969
.....
>In late 2022, AlphaTensor was able to construct a faster algorithm for multiplying matrices of small sizes (e.g., specifically 4x4 matrices over the field Z_2, in 47 multiplications versus 49 by the Strassen algorithm, or 64 using the naive algorithm).[2] AlphaTensor's result of 96 multiplications for 5x5 matrices over any field (compared to 98 by the Strassen algorithm) was reduced to 95 a week later with further human optimization.
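To make the trade-off concrete: Strassen's scheme multiplies two 2x2 matrices with 7 multiplications instead of the naive 8, at the cost of extra additions and subtractions. A minimal sketch in Python (assumption: plain integer entries for readability; in the real algorithm the same formulas are applied recursively to sub-matrix blocks, which is where saving a multiplication pays off):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with 7 multiplications (naive needs 8).

    A and B are 2x2 nested lists. The extra additions/subtractions are
    the price paid for saving one multiplication -- a win when the
    entries are themselves large sub-matrices and multiplication
    dominates the cost.
    """
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)   # the 7 Strassen products
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    # Recombine with additions only
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]


def naive_2x2(A, B):
    """Standard 8-multiplication product, for comparison."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
```

Both give the same answer; Strassen just rearranges the arithmetic so that one multiplication becomes a handful of cheaper additions.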