Semi-analytical solution of Brent equations

Abstract

A parametrization of the Brent equations is proposed which admits a severalfold reduction in the number of unknowns and equations. The resulting equations are solved numerically, and the fast matrix multiplication algorithms obtained reproduce, and in some cases improve, many known values of the rank; in particular, the designs (4,4,4;48) and (2,4,5;32) are found.
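The Brent equations require the coefficient tensors U, V, W of a bilinear scheme to reproduce the matrix multiplication tensor exactly. As an illustrative sketch (not taken from the paper; the function name, variable names, and row-major index layout are assumptions made here), the following Python code checks the equations for Strassen's classical (2,2,2;7) design:

```python
# Numerical check of the Brent equations for Strassen's (2,2,2;7) scheme.
# Rows of U, V, W correspond to the 7 bilinear products; columns index the
# entries of A, B, C in row-major order: (1,1), (1,2), (2,1), (2,2).

U = [  # coefficients of A in each product
    [1, 0, 0, 1],   # A11 + A22
    [0, 0, 1, 1],   # A21 + A22
    [1, 0, 0, 0],   # A11
    [0, 0, 0, 1],   # A22
    [1, 1, 0, 0],   # A11 + A12
    [-1, 0, 1, 0],  # A21 - A11
    [0, 1, 0, -1],  # A12 - A22
]
V = [  # coefficients of B in each product
    [1, 0, 0, 1],   # B11 + B22
    [1, 0, 0, 0],   # B11
    [0, 1, 0, -1],  # B12 - B22
    [-1, 0, 1, 0],  # B21 - B11
    [0, 0, 0, 1],   # B22
    [1, 1, 0, 0],   # B11 + B12
    [0, 0, 1, 1],   # B21 + B22
]
W = [  # contribution of each product to the entries of C
    [1, 0, 0, 1],   # M1 -> C11 + C22
    [0, 0, 1, -1],  # M2 -> C21 - C22
    [0, 1, 0, 1],   # M3 -> C12 + C22
    [1, 0, 1, 0],   # M4 -> C11 + C21
    [-1, 1, 0, 0],  # M5 -> C12 - C11
    [0, 0, 0, 1],   # M6 -> C22
    [1, 0, 0, 0],   # M7 -> C11
]

def brent_residual(U, V, W, n=2, m=2, p=2):
    """Max deviation of sum_q u_q[i1,i2] v_q[j1,j2] w_q[k1,k2]
    from delta(i2,j1) * delta(i1,k1) * delta(j2,k2)."""
    r = len(U)
    worst = 0.0
    for i1 in range(n):
        for i2 in range(m):
            for j1 in range(m):
                for j2 in range(p):
                    for k1 in range(n):
                        for k2 in range(p):
                            s = sum(U[q][i1 * m + i2] * V[q][j1 * p + j2]
                                    * W[q][k1 * p + k2] for q in range(r))
                            target = float(i2 == j1 and i1 == k1 and j2 == k2)
                            worst = max(worst, abs(s - target))
    return worst

print(brent_residual(U, V, W))  # 0.0 for an exact scheme
```

The same residual check applies to any (n,m,p;r) design by passing the corresponding dimensions; for approximate or floating-point schemes, the residual measures how closely the Brent equations are satisfied.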

About the author

I. Kaporin

FRC CSC RAS

Author for correspondence.
Email: igorkaporin@mail.ru
Russian Federation, Moscow

References

  1. Brent R.P. Algorithms for matrix multiplication. (Report No. STAN-CS-70-157). Stanford Univ. CA Dept. of Computer Science, 1970, 58p.
  2. Strassen V. Gaussian elimination is not optimal // Numer. Math. 1969. V. 13. № 4. P. 354–356.
  3. Smirnov A.V. On the bilinear complexity and practical algorithms for matrix multiplication // Zh. Vychisl. Mat. Mat. Fiz. 2013. V. 53. № 12. P. 1970–1984.
  4. Beniamini G. et al., Sparsifying the operators of fast matrix multiplication algorithms. arXiv preprint arXiv:2008.03759 (2020) https://arxiv.org/pdf/2008.03759.pdf
  5. Ballard G., Ikenmeyer C., Landsberg J.M., Ryder N. The geometry of rank decompositions of matrix multiplication II: 3×3 matrices // J. of Pure and Applied Algebra. 2019. V. 223. № 8. P. 3205–3224.
  6. Laderman J.D. A noncommutative algorithm for multiplying 3×3 matrices using 23 multiplications // Bull. Amer. Math. Soc. 1976. V. 82. № 1. P. 126–128.
  7. Kaporin I. A derivative-free nonlinear least squares solver. In: Olenev N.N., Evtushenko Y.G., Jacimovic M., Khachay M., Malkova V. (eds.) Optimization and Applications. OPTIMA 2021. Lecture Notes in Computer Science, V. 13078. P. 217–230. Springer, Cham, 2021. https://doi.org/10.1007/978-3-030-91059-4_16
  8. Kaporin I. Verifying the correctness of the (4,4,4;48) matrix multiplication scheme with complex coefficients exact up to the floating point tolerance // 2024. URL: https://cloud.mail.ru/public/Yfij/ErDxopqBh
  9. Hopcroft J.E., Kerr L.R. On minimizing the number of multiplications necessary for matrix multiplication // SIAM Journal on Appl. Math. 1971. V. 20. № 1. P. 30–36.
  10. Berger G.O., Absil P.A., De Lathauwer L., Jungers R.M., Van Barel M. Equivalent polyadic decompositions of matrix multiplication tensors // J. of Comput. and Appl. Math. 2022. V. 406. P. 113941. https://doi.org/10.1016/j.cam.2021.113941
  11. Fawzi A. et al. Discovering faster matrix multiplication algorithms with reinforcement learning // Nature. 2022. V. 610. № 7930. P. 47–53.
  12. Li X., Zhang L., Ke Y. Deflation conjecture and local dimensions of Brent equations // arXiv preprint arXiv:2310.11686. 2023 Oct 18.
  13. Ballard G., Weissenberger J., Zhang L. Accelerating neural network training using arbitrary precision approximating matrix multiplication algorithms // 50th International Conference on Parallel Processing Workshop 2021 Aug 9. P. 1–8. https://doi.org/10.1145/3458744.3474050


Copyright © Russian Academy of Sciences, 2024