Computational load increases significantly in Gen AI use cases, where models must process large datasets. For example, training even a simple machine learning model on homomorphically encrypted data can take several orders of magnitude longer than training on plaintext.
This challenge is exacerbated by encryption schemes such as Cheon-Kim-Kim-Song (CKKS) and Brakerski/Fan-Vercauteren (BFV), which must manage issues such as noise accumulation and bootstrapping, especially in deep computations such as neural network inference.
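To make the overhead concrete, here is a minimal sketch of a single CKKS-encrypted computation using the open-source TenSEAL wrapper around Microsoft SEAL. The library choice, parameter values, and toy weights are illustrative assumptions, not details from the article:

```python
import time
import tenseal as ts  # pip install tenseal; Python wrapper around Microsoft SEAL

# CKKS parameters: the coefficient modulus chain bounds how many multiplications
# we can evaluate before noise/scale management becomes a problem.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],  # supports only a couple of multiplications
)
context.global_scale = 2 ** 40
context.generate_galois_keys()

x = [0.1, 0.2, 0.3, 0.4]      # toy input features
w = [0.5, -0.25, 0.75, 0.1]   # toy model weights, kept in plaintext

enc_x = ts.ckks_vector(context, x)

start = time.perf_counter()
# One "neuron": encrypted dot product followed by a squared activation.
# Each multiplication consumes a level of the modulus chain; a deep network
# would exhaust it and force bootstrapping or much larger parameters.
z = enc_x.dot(w)
activated = z * z
elapsed = time.perf_counter() - start

print("encrypted result:", activated.decrypt())
print(f"encrypted compute time: {elapsed:.4f}s (the same plaintext math takes microseconds)")
```

Even this tiny circuit runs orders of magnitude slower than its plaintext equivalent, which is why deep-network inference under HE is so costly.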
Efforts to reduce these computing requirements include the development of advanced libraries such as IBM’s HElib, Microsoft SEAL, and PALISADE, which aim to cut processing times by refining encryption algorithms and their implementations. Research also focuses on hybrid methods that combine HE with conventional symmetric encryption such as AES to balance security and speed.
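As a rough illustration of the hybrid idea (one common pattern, not necessarily the specific designs this research explores): bulk data stays under fast symmetric encryption such as AES, and only the small numeric portion that must be computed on is encrypted under HE. The libraries, field names, and parameters below are assumptions made for the sketch:

```python
import os
import json
import tenseal as ts
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

# Fast path: the bulk record is protected with AES-GCM (cheap to encrypt and decrypt).
aes_key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(aes_key)
nonce = os.urandom(12)
record = json.dumps({"name": "alice", "notes": "large free-text blob ..."}).encode()
bulk_ciphertext = aesgcm.encrypt(nonce, record, None)

# Slow path: only the numeric features the server must compute on are
# encrypted homomorphically (CKKS via TenSEAL / Microsoft SEAL).
ctx = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()

features = [42.0, 3.5, 0.8]                 # small feature vector
enc_features = ts.ckks_vector(ctx, features)

# The untrusted server computes only on enc_features; bulk_ciphertext stays opaque to it.
enc_score = enc_features.dot([0.2, 1.1, -0.4])  # toy linear model with plaintext weights
print("decrypted score:", enc_score.decrypt())
```

Restricting the expensive homomorphic operations to the data that actually needs computation is what lets such hybrids trade a little architectural complexity for large speedups.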
These innovations aim to make HE more effective in real-world applications, but scaling it to the most demanding Gen AI scenarios remains a significant hurdle. If they prove effective, they could drive Gen AI adoption to unprecedented levels.