
Investigate "Ciphertext-Ciphertext Matrix Multiplication: Fast for Large Matrices" #1618

Open
j2kun opened this issue Mar 25, 2025 · 2 comments
Labels
research synthesis Reading papers to figure out which ideas can be incorporated

Comments

@j2kun
Collaborator

j2kun commented Mar 25, 2025

This was the same idea Craig Gentry gave in his keynote at FHE.org 2025: a ciphertext-ciphertext matmul reduces to 4 plaintext matmuls of the same size, combined with a number of key-switching operations whose total cost doesn't exceed the matmul cost.
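Roughly, the shape of the reduction can be pictured as follows. This is a hedged structural sketch in NumPy, not the paper's algorithm: I'm treating an encrypted matrix as a pair of plaintext component matrices (as in RLWE-style schemes, where a ciphertext is a pair decrypting to c0 + c1·s), and ignoring encodings, moduli, noise, and the key-switching step itself.

```python
import numpy as np

def cc_mm_structure(ct_A, ct_B):
    """Illustrative only: the degree-2 product of two pair-form ciphertext
    matrices ct_A = (A0, A1) and ct_B = (B0, B1).

    Expanding (A0 + A1*s)(B0 + B1*s) gives three components, computed from
    exactly four plaintext matrix products: A0@B0, A0@B1, A1@B0, A1@B1.
    The quadratic-in-s component d2 is what key switching (relinearization)
    would fold back into a pair-form ciphertext.
    """
    A0, A1 = ct_A
    B0, B1 = ct_B
    d0 = A0 @ B0              # constant term
    d1 = A0 @ B1 + A1 @ B0    # linear-in-secret term
    d2 = A1 @ B1              # quadratic term, removed by key switching
    return d0, d1, d2

if __name__ == "__main__":
    A0 = np.array([[1.0, 2.0], [3.0, 4.0]])
    A1 = np.eye(2)
    B0 = np.eye(2)
    B1 = np.zeros((2, 2))
    d0, d1, d2 = cc_mm_structure((A0, A1), (B0, B1))
    print(d0, d1, d2, sep="\n")
```

The point is that all four products are ordinary plaintext matmuls of the original dimensions, so a tuned BLAS can do the heavy lifting, with key switching as lower-order overhead.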

https://eprint.iacr.org/2025/448

Jai Hyun Park
CryptoLab Inc., Lyon, France
Abstract. Matrix multiplication of two encrypted matrices (CC-MM) is a key challenge for privacy-preserving machine learning applications. As modern machine learning models focus on scalability, fast CC-MM on large datasets is increasingly in demand.

In this work, we present a CC-MM algorithm for large matrices. The algorithm consists of plaintext matrix multiplications (PP-MM) and ciphertext matrix transpose algorithms (C-MT). We propose a fast C-MT algorithm, which is computationally inexpensive compared to PP-MM. By leveraging high-performance BLAS libraries to optimize PP-MM, we implement large-scale CC-MM with substantial performance improvements. Furthermore, we propose lightweight algorithms, significantly reducing the key size from 1960 MB to 1.57 MB for CC-MM with comparable efficiency.

In a single-thread implementation, the C-MT algorithm takes 0.76 seconds to transpose a 2048 × 2048 encrypted matrix. The CC-MM algorithm requires 85.2 seconds to multiply two 4096 × 4096 encrypted matrices. For large matrices, our algorithm outperforms the state-of-the-art CC-MM method from Jiang-Kim-Lauter-Song [CCS'18] by a factor of over 800.

@j2kun j2kun added the research synthesis Reading papers to figure out which ideas can be incorporated label Mar 25, 2025
@asraa
Collaborator

asraa commented Mar 25, 2025

Dupe of #1587?

@j2kun
Collaborator Author

j2kun commented Mar 27, 2025

Ha, yes. I saw the other paper go up on IACR ePrint but didn't fully understand what it was doing. Then I forgot about it, and when I saw Craig's talk and the discussions around it, I got excited.
