Author: 0x_Todd; Source: X@0x_Todd
The market has been quiet lately, which finally gives me time to keep sharing some new technical directions. Although the crypto market in 2024 is not as turbulent as before, a few new technologies are still working their way toward maturity, including today's topic: FHE, or Fully Homomorphic Encryption. Vitalik also published an article on FHE in May this year; interested readers are encouraged to look it up.
So what kind of technology is FHE?
To make sense of the clunky term FHE, fully homomorphic encryption, you first need to understand what "encryption" means, what "homomorphic" means, and why it has to be "fully".
1. What is encryption?
Ordinary encryption is what everyone is most familiar with. Say Alice wants to send a message to Bob, such as "1314 520".

Now suppose a third party, C, has to deliver the message, and the message must stay confidential. That's easy: just multiply every number by 2, so the message becomes "2628 1040".

When Bob receives it, he divides each number by 2 in turn and decrypts it: Alice is saying "1314 520".

See? The two of them have transmitted the message using symmetric encryption, hiring C as a courier without letting C learn anything. In spy movies, the communication between two field agents rarely goes beyond this.
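For readers who prefer to see it in code, here is a minimal runnable sketch of this toy scheme (the key value 2 and the function names are purely illustrative; a real cipher is of course far more than "multiply by 2"):

```python
# Toy symmetric encryption from the story: the shared secret is simply "times 2".
# C, the courier, only ever sees the encrypted numbers.

KEY = 2                                   # shared secret between Alice and Bob

def encrypt(message: list[int]) -> list[int]:
    return [m * KEY for m in message]

def decrypt(ciphertext: list[int]) -> list[int]:
    return [c // KEY for c in ciphertext]

plaintext = [1314, 520]
ciphertext = encrypt(plaintext)           # [2628, 1040]: what C carries
print(ciphertext)
print(decrypt(ciphertext))                # [1314, 520]: what Bob reads
```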
2. What is homomorphic encryption?
Now Alice's requirements get harder: suppose Alice is only 7 years old, and the only arithmetic she can do is ×2 and ÷2; she doesn't understand any other operations.

Okay, now suppose Alice has to pay the electricity bill. Her electricity costs 400 yuan per month, and she is 12 months behind.

But 400 × 12 = ? That calculation is beyond a 7-year-old Alice; it is too complicated for her.

At the same time, she doesn't want anyone to know how much her family's electricity bill is or how many months she owes, because that is sensitive information.

So Alice has to ask C to do the calculation for her, without trusting C.

Since she only knows ×2 and ÷2, she uses ×2 as a simple encryption of her numbers and asks C to compute 800 × 24, that is, (400 × 2) times (12 × 2).

C is an adult with strong mental arithmetic. He quickly works out 800 × 24 = 19200 and tells Alice the number. Alice then computes 19200 ÷ 2 ÷ 2 and soon learns that she owes 4800 yuan in electricity bills.

See? That is the simplest multiplicative homomorphic encryption: 800 × 24 is just a mapping of 400 × 12. The "shape" of the calculation stays the same before and after the transformation, which is why it is called "homomorphic".

This kind of encryption lets someone delegate a computation to an untrusted party while making sure the sensitive numbers never leak.
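In code, the multiplicative homomorphism of this toy scheme looks like the sketch below (again purely illustrative: Enc(x) = 2x, and because Enc(a) × Enc(b) = 4ab, Alice recovers the true product by dividing by 2 twice):

```python
# Toy multiplicative homomorphism from the electricity-bill story.
# Enc(x) = 2x, so Enc(a) * Enc(b) = 4 * a * b: Alice decrypts the product
# by dividing by 2 once per ciphertext involved.

def enc(x: int) -> int:
    return x * 2                      # the only "encryption" Alice knows

def untrusted_compute(a_ct: int, b_ct: int) -> int:
    return a_ct * b_ct                # C multiplies ciphertexts, never sees 400 or 12

result_ct = untrusted_compute(enc(400), enc(12))   # C computes 800 * 24 = 19200
result = result_ct // 2 // 2                       # Alice decrypts: 19200 / 4 = 4800
print(result)                                      # 4800 yuan owed
```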
3. Why does "homomorphic encryption" need to be "full"?
But that was the ideal world; real-world problems are not so simple. Not everyone is 7 years old, and not everyone is as honest as C.

Let's assume a very bad situation: C may try to work backwards. By brute force, C could figure out that Alice was really computing 400 and 12.

This is where "fully homomorphic encryption" comes in.
When Alice multiplies each number by 2, that "2" can be thought of as noise. If there is too little noise, C can crack it easily.

So Alice can introduce addition on top of the multiplication.

Ideally the noise should be as dense as the traffic at a big intersection at 9 a.m., so that cracking it is, for C, harder than climbing to the sky.

So Alice could, say, multiply 4 times and add 8 times, which makes it far less likely that C can crack her numbers.
Even then, what Alice has is still only "partially" homomorphic encryption, meaning: (1) the content she encrypts can only be used for a specific class of problems; (2) she can only use a specific, limited set of operations, because the number of additions and multiplications cannot be too large (generally no more than about 15).

"Fully" means Alice should be allowed to perform any number of additions and multiplications on a polynomial, so that she can delegate the computation to a third party and still get the correct result after decryption.

A sufficiently long polynomial can express almost every mathematical problem in the world, not just a 7-year-old's electricity bill.

And with an arbitrary number of encrypted operations, it becomes practically impossible for C to spy on the private data, truly achieving "having it both ways".
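The article does not name a concrete scheme, so as an illustration here is a heavily simplified version of the DGHV "somewhat homomorphic" scheme over the integers (van Dijk, Gentry, Halevi, Vaikuntanathan), with tiny, insecure parameters. It shows exactly why the number of multiplications is limited: the noise inside each ciphertext multiplies at every step, and once it outgrows the secret key, decryption breaks. That is the limit Gentry's 2009 bootstrapping idea removes.

```python
import random

# Toy DGHV-style "somewhat homomorphic" encryption of single bits.
# Secret key: a large odd number p. Enc(m) = q*p + 2*r + m with small noise r.
# Dec(c) = (c mod p) mod 2, correct only while the noise stays below p/2.
# Parameters here are tiny and purely illustrative, not secure.

p = 10_000_019                          # secret key (far too small in practice)

def enc(bit: int) -> int:
    q = random.randint(1, 1_000)        # hides the key behind a random multiple
    r = random.randint(1, 10)           # the small "noise" discussed above
    return q * p + 2 * r + bit

def dec(ct: int) -> int:
    return (ct % p) % 2

ct = enc(1)
noise_bound = 21                        # a fresh ciphertext's noise is at most 2*10 + 1
for depth in range(1, 9):
    ct *= enc(1)                        # multiplying ciphertexts = AND of the bits (all 1s here)
    noise_bound *= 21                   # the noise bound multiplies at every step
    print(depth, dec(ct), "still decryptable:", noise_bound < p // 2)
# After roughly 5 multiplications the noise bound passes p/2 and dec() becomes
# unreliable: exactly the "no more than N multiplications" limit of a
# *somewhat* homomorphic scheme.
```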
That is why fully homomorphic encryption has long been regarded as a crown jewel of cryptography.

In fact, until 2009, homomorphic encryption only supported "partial homomorphic encryption".

The new approach proposed by Gentry and other scholars in 2009 opened the door to fully homomorphic encryption. Interested readers can look up that paper.
Many friends are still unsure about the application scenarios of this technology: in which situations would fully homomorphic encryption (FHE) actually be needed?
For example, AI.
As we all know, a powerful AI needs enough data to feed it, but much of that data is far too privacy-sensitive to hand over. Can FHE let us have it both ways here?
The answer is yes.
You can: (1) encrypt your sensitive data with FHE; (2) feed the encrypted data to the AI for computation; (3) the AI then spits out a pile of ciphertext that no one can read.

The AI can do this, unsupervised, because the data is essentially vectors. AI, and especially generative AI such as GPT, never understands the words we feed it the way a human does; it just "predicts" the most plausible answer from those vectors.
However, since that pile of ciphertext follows certain mathematical rules, and you are the one who encrypted it, then: (4) you can go offline and decrypt the ciphertext locally, just like Alice did; (5) which means you have made the AI use its enormous computing power to finish the calculation for you without it ever touching your sensitive data.
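To make steps (1) through (5) concrete, here is a small runnable mock of the flow. The "FHE scheme" is just the ×2 toy cipher from earlier and the "AI" is a three-weight linear scoring function; both are purely illustrative stand-ins (a real deployment would use an actual FHE library and model). The point is only what each side gets to see: the server works on ciphertext, and only the holder of the key reads the result.

```python
# Mock of the encrypt -> compute-on-ciphertext -> decrypt-locally flow.
# The toy cipher Enc(x) = 2x is linear, so a weighted sum of ciphertexts is
# the encryption of the weighted sum of plaintexts; real FHE schemes support
# far richer computations, but the division of roles is the same.

SECRET_KEY = 2                            # stays on the user's device

def encrypt(x: float) -> float:
    return x * SECRET_KEY

def decrypt(c: float) -> float:
    return c / SECRET_KEY

def remote_ai_score(encrypted_features: list[float]) -> float:
    # The "AI server": it only ever sees encrypted numbers.
    weights = [0.3, 0.5, 0.2]             # pretend these are model weights
    return sum(w * c for w, c in zip(weights, encrypted_features))

features = [36.6, 120.0, 80.0]                    # pretend these are sensitive
enc_features = [encrypt(f) for f in features]     # (1) encrypt locally
enc_score = remote_ai_score(enc_features)         # (2)+(3) server computes blindly
print(decrypt(enc_score))                         # (4)+(5) decrypt locally: ~86.98
```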
Today's AI cannot do this, and you have to give up your privacy to use it. Think about everything you have typed into GPT in plain text! To fix that, FHE is indispensable.

This is the root of the natural fit between AI and FHE. A thousand words boil down to one: both.
Because FHE ties into AI and straddles the two big fields of crypto and AI, it naturally enjoys extra favor. There are quite a few FHE projects, such as Zama, Privasea, Mind Network, Fhenix, and Sunscreen, and their application directions are quite creative.
Today, let's take one of these projects, @Privasea_ai, as an example.

This is an FHE project whose investment round was led by Binance. Its white paper describes a very fitting scenario: face recognition.
Having it both ways: the machine's computing power can determine whether the person is real, while the machine never touches any sensitive facial information.
The introduction of FHE can effectively solve this problem.
However, doing FHE computation in the real world takes an enormous amount of computing power. After all, Alice needs to perform "arbitrary" numbers of additions and multiplications under encryption, and computation, encryption, and decryption are all compute-hungry processes.

So Privasea needs to build a powerful computing network and the supporting facilities around it, and it has proposed a PoW + PoS network architecture to do so.

Recently, Privasea announced its own PoW hardware, the WorkHeart USB, which can be understood as one of the supporting facilities of Privasea's computing network. Or you can simply think of it as a mining machine.
The initial price is 0.2 ETH, and these devices can mine 6.66% of the network's total token supply.

There is also a PoS-like asset called the StarFuel NFT, which can be understood as a "work permit", with a total supply of 5,000.

Its initial price is also 0.2 ETH, and it grants 0.75% of the network's total token supply (via airdrop).

This NFT is also interesting: it is PoS-like but not real PoS, an attempt to sidestep the question of "is PoS a security in the United States?"

The NFT lets users stake Privasea tokens, but instead of directly generating PoS yield, it doubles the mining efficiency of the USB device it is bound to, so it is PoS in disguise.
PS: I invested in this project earlier, so I have a discounted early-bird mint invitation code: siA7P0. If you are interested, you can use it at https://nft.privasea.ai/WorkHeartNFT
Back to the topic: if AI can truly bring FHE technology to large-scale adoption, that would be a blessing for AI itself. Keep in mind that many countries now focus their AI regulation on data security and data privacy.

To give a perhaps inappropriate example: in the Russia-Ukraine war, some Russian military units have reportedly tried to use AI, but given the American background of a large number of AI companies, their intelligence would likely be riddled with leaks.

Yet not using AI at all means falling far behind. Even if the gap does not look large today, give it another 10 years and we probably cannot imagine a world without AI.

So data privacy issues are everywhere in our lives, from wars between countries to unlocking a phone with your face.

In the AI era, if FHE technology can truly mature, it will undoubtedly be humanity's last line of defense.