One way you can do that is to apply fully homomorphic encryption to a machine emulator. The result is a program that can run encrypted programs without them ever being decrypted: https://en.wikipedia.org/wiki/Homomorphic_encryption#Fully_homomorphic_encry...

Moreover, the encryption is asymmetric, and it is computationally infeasible to extract the private key from the encrypted emulator (because it's encrypted with itself!). The encrypted program runs on the encrypted emulator with only a linear slowdown, although the constant factor is ridiculously immense (maybe 10^9 or so).

Best wishes,

Adam P. Goucher
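As a rough sketch of the gate-on-ciphertexts idea (and only that -- not of bootstrapping or of an actual encrypted emulator), here is a toy Python version of the symmetric "somewhat homomorphic" scheme over the integers in the style of van Dijk-Gentry-Halevi-Vaikuntanathan. The parameter sizes are invented for readability and are wildly insecure, and the noise grows with every gate, so this is not fully homomorphic; it only shows how a universal gate (NAND) can be evaluated directly on ciphertexts.

import secrets

P_BITS, Q_BITS, R_BITS = 64, 128, 8   # toy sizes, nowhere near secure

def keygen():
    # Secret key: a random odd p, with its top bit forced so the noise bound holds.
    return secrets.randbits(P_BITS) | (1 << (P_BITS - 1)) | 1

def encrypt(p, bit):
    q = secrets.randbits(Q_BITS)
    r = secrets.randbits(R_BITS)
    return p * q + 2 * r + bit        # large multiple of p + small even noise + the bit

def decrypt(p, c):
    return (c % p) % 2                # strip the multiple of p, read off the parity

def he_xor(c1, c2):                   # ciphertext addition = XOR of the plaintext bits
    return c1 + c2

def he_and(c1, c2):                   # ciphertext multiplication = AND of the plaintext bits
    return c1 * c2

def he_nand(c1, c2):                  # NAND is universal; adding 1 flips the plaintext parity
    return he_and(c1, c2) + 1

if __name__ == "__main__":
    p = keygen()
    for a in (0, 1):
        for b in (0, 1):
            c = he_nand(encrypt(p, a), encrypt(p, b))
            assert decrypt(p, c) == 1 - (a & b)
    print("homomorphic NAND agrees with plaintext NAND on all inputs")

Gentry's bootstrapping step -- re-encrypting the noisy ciphertext under its own encrypted key and evaluating the decryption circuit homomorphically -- is what turns a scheme like this into a fully homomorphic one, and it is also where the enormous constant factor comes from.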
Sent: Friday, May 18, 2018 at 2:04 AM
From: "Tom Duff" <td@pixar.com>
To: andres.valloud@gmail.com, math-fun <math-fun@mailman.xmission.com>
Subject: Re: [math-fun] number arithmetic: power side-channels?
Reference?
On Thu, May 17, 2018 at 17:56 Andres Valloud <avalloud@smalltalk.comcastbiz.net> wrote:
Not long ago, someone found a way to obfuscate programs into essentially irreversible unintelligibility while keeping their behavior intact. Unfortunately, in the paper this achievement came at an impractical cost in obfuscation overhead. Perhaps there is a way to obfuscate software just as effectively while losing only an order of magnitude in performance. Do you think that approach would thwart power side-channel attacks?
On 5/16/18 10:46, Henry Baker wrote:
I've been thinking about the *power* side-channel: the ability to watch instantaneous power consumption to guess what a computer is computing.
Closely related: the chip temperature side-channel: the ability to watch instantaneous temperature distributions across a chip to guess what a computer is computing.
Note that simple power-supply filtering doesn't work well enough: an attacker who can watch enough of the computation may still be able to discern some amount of information.
Since many computers would like to keep what they are computing confidential, this raises the question:
**Are there computer arithmetic circuits which draw the same sequence of instantaneous power draws *regardless* of the numbers being computed or moved?**
For example, some computer circuit may draw slightly more power when a "1" appears on a bus instead of a "0". Under these conditions, it might make sense to drive the bus with both the number and its binary complement, in order to keep the power draw the same, no matter what bit pattern is being operated on.
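A toy software model of that trick, with an assumed 8-bit bus width purely for illustration: each data bit is put on a pair of wires as the bit and its complement, so the number of wires held high is the same no matter what value is transferred.

BUS_WIDTH = 8   # illustrative width

def dual_rail(value):
    """Wire levels for one bus word: each bit b is driven as the pair (b, not b)."""
    wires = []
    for i in range(BUS_WIDTH):
        b = (value >> i) & 1
        wires.append(b)        # "true" rail
        wires.append(1 - b)    # "complement" rail
    return wires

# To first order, dynamic power tracks how many wires are high and how many
# toggle; with dual-rail encoding the count of high wires is constant.
assert all(sum(dual_rail(v)) == BUS_WIDTH for v in range(2 ** BUS_WIDTH))

Real dual-rail designs usually add a precharge (return-to-zero) phase as well, so that the number of wire transitions per cycle -- not just the number of high wires -- is independent of the data; the gate-level sketch below shows that idea.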
Are there particular number representations and arithmetic circuits (or even *boolean circuits*) whose power consumption is indifferent ("oblivious") to the input bit patterns?
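One published family of answers at the Boolean-circuit level is dual-rail precharge logic such as WDDL (wave dynamic differential logic). Below is a toy gate-level model of a WDDL-style AND gate -- the two-phase clocking and the transition counting are simplifications, not a circuit simulator -- demonstrating that the switching activity is the same for all four input combinations.

def wddl_and(a_t, a_f, b_t, b_f):
    # Dual-rail AND: the true rail is the AND of the true rails, the false rail
    # is the OR of the false rails, i.e. NOT(a AND b) when inputs are complementary.
    return a_t & b_t, a_f | b_f

def transitions_per_cycle(a, b):
    precharge = wddl_and(0, 0, 0, 0)          # precharge phase: every rail driven low
    evaluate  = wddl_and(a, 1 - a, b, 1 - b)  # evaluate phase: complementary data applied
    return sum(p != e for p, e in zip(precharge, evaluate))   # rails that switch

counts = {(a, b): transitions_per_cycle(a, b) for a in (0, 1) for b in (0, 1)}
assert len(set(counts.values())) == 1    # identical switching activity for every input
print(counts)                            # exactly one output rail rises in each case

In real silicon the two rails of each pair never have exactly matched capacitance and drive strength, which is essentially the "not 100% complementary" caveat below.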
Note that CMOS logic typically uses p-type and n-type (PMOS and NMOS) transistors in a complementary fashion. However, due to semiconductor physics, these transistors are not 100% complementary -- especially at high clock rates -- and therefore they don't provide as much obliviousness as one would like, so assume for this conversation that we might still have to mirror even CMOS gates.
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun