[math-fun] number arithmetic: power side-channels?
I've been thinking about the *power* side-channel: the ability to watch instantaneous power consumption to guess what a computer is computing. Closely related is the chip-temperature side-channel: the ability to watch instantaneous temperature distributions across a chip to guess what a computer is computing.

Note that simple power-supply filtering doesn't work well enough, as one might be able to watch enough computation to still discern some amount of information.

Since many computers would like to keep confidential what they are computing, the question is raised:

**Are there computer arithmetic circuits which draw the same sequence of instantaneous power draws *regardless* of the numbers being computed or moved?**

For example, some computer circuit may draw slightly more power when a "1" appears on a bus instead of a "0". Under these conditions, it might make sense to drive the bus with both the number and its binary complement, in order to keep the power draw the same no matter what bit pattern is being operated on.

Are there particular number representations and arithmetic circuits (or even *boolean circuits*) whose power consumption is indifferent ("oblivious") to the input bit patterns?

Note that CMOS typically utilizes PMOS and NMOS transistors in a complementary fashion. However, due to semiconductor physics, these transistors are not 100% complementary -- especially at high clock rates -- and therefore they don't provide as much obliviousness as one would like, so assume for this conversation that we might still have to mirror even CMOS gates.
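Henry's drive-the-bus-with-value-and-complement idea can be checked in a toy model. This is an editor's sketch, not hardware: it only shows that *if* power tracked the number of asserted lines, the complemented bus would make that count data-independent (the bus width and the cost model are assumptions for illustration):

```python
WIDTH = 8

def bus_ones(value):
    """Number of '1' lines when value is driven on a plain WIDTH-bit bus."""
    return bin(value & ((1 << WIDTH) - 1)).count("1")

def dual_rail_ones(value):
    """Number of '1' lines when the bus carries value alongside its
    bitwise complement on a second set of WIDTH wires."""
    mask = (1 << WIDTH) - 1
    return bus_ones(value) + bus_ones(~value & mask)

# A plain bus leaks the Hamming weight of the data (9 distinct weights)...
assert len({bus_ones(v) for v in range(256)}) == 9
# ...but the complemented pair always asserts exactly WIDTH lines.
assert {dual_rail_ones(v) for v in range(256)} == {WIDTH}
```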
There are circuit topologies (as opposed to semiconductor technologies) that compute both the true and complement of every signal. To a very good first approximation, the power dissipated is constant in these circuits independent of the data flowing through them. The downside is that power dissipation is constant, but high. ECL logic is an early bipolar version, but similar topologies can be built with NMOS or CMOS technologies. It might be appropriate to use these topologies in security-sensitive applications.
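Knight's true-plus-complement topology can be sketched abstractly (a toy logical model, not ECL circuitry): every signal is a rail pair with exactly one rail high, and each gate also produces its complement via the De Morgan dual, so the number of asserted rails is the same for every input pattern:

```python
def enc(bit):
    """Encode a logical bit as a dual-rail (true, false) pair."""
    return (bit, 1 - bit)

def dual_and(x, y):
    """Dual-rail AND: the complement rail is computed directly as the
    De Morgan OR of the complement rails, never by inverting the output."""
    return (x[0] & y[0], x[1] | y[1])

for a in (0, 1):
    for b in (0, 1):
        q = dual_and(enc(a), enc(b))
        rails = enc(a) + enc(b) + q
        assert sum(rails) == 3      # 3 signals -> 3 asserted rails, always
        assert q == enc(a & b)      # and it still computes AND correctly
```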
On May 16, 2018, at 1:46 PM, Henry Baker <hbaker1@pipeline.com> wrote:
_______________________________________________ math-fun mailing list math-fun@mailman.xmission.com https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
Constant power might not be too cool, but may be unavoidable for certain calculations -- e.g., AES encryption/decryption. It would be an interesting theorem if constant power were the *only* solution.

Some sort of "tri-stating with memory" might allow lower power consumption for the vast majority of wires that won't be changing on every clock cycle. Having a wire stay at either "0" or "1" for long periods of time doesn't cost anything, but changing the state of a wire does cost the charging/discharging of the wire's capacitance.

In this type of logic, the problem won't be "1"s or "0"s, but transitions: "+1", "-1". So the game may be to have a double-width bus where half the bus gets XOR'd with <secret> and the other half gets XOR'd with not-<secret>, so that exactly length-<secret> bits change.

I think it may be possible to build systems having both "secret" and "non-secret" bits; the only constraint is that the instantaneous power is independent of the secret bits -- i.e., the power-consumption function is a *tautology* of the secret bits.

This leads us to consider a circuit-design system which outputs both a circuit for the value to be computed and another circuit which computes the power/energy consumed by the computation. It is up to the circuit-design system to make sure that the two sets of circuits stay in sync.

While we're at it, we can also output a circuit that computes the *timing* of the value-computing circuit. For many situations this circuit may be trivial, but for others -- e.g., carry propagation -- it may not be.

At 01:47 PM 5/16/2018, Tom Knight wrote:
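Henry's point that the cost is transitions rather than levels can be made concrete. The sketch below is not his exact XOR-with-<secret> scheme; it models the standard dual-rail return-to-zero discipline (both rails of every bit precharged low each cycle, then exactly one raised), which achieves the same stated goal of a data-independent flip count:

```python
def transitions(old, new):
    """Number of wires that change between two tuples of wire states."""
    return sum(o != n for o, n in zip(old, new))

def encode(word, width):
    """Dual-rail evaluate phase: a rail pair (bit, not-bit) per position."""
    wires = []
    for i in range(width):
        bit = (word >> i) & 1
        wires += [bit, 1 - bit]
    return tuple(wires)

WIDTH = 8
idle = (0,) * (2 * WIDTH)        # precharge phase: all rails low

counts = set()
for word in range(256):
    phase = encode(word, WIDTH)
    counts.add(transitions(idle, phase) + transitions(phase, idle))
assert counts == {2 * WIDTH}     # every word costs exactly 16 flips
```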
Paul Kocher did lots of research along these lines: https://www.rambus.com/introduction-to-differential-power-analysis-and-relat...

On Wed, May 16, 2018 at 3:39 PM, Henry Baker <hbaker1@pipeline.com> wrote:
--
Mike Stay - metaweta@gmail.com
http://www.math.ucr.edu/~mike
http://reperiendi.wordpress.com
First step, combinational logic:

"Differential signaling": for every signal wire A, there's an equal and opposite signal wire (not A).

For every gate, there's a paired dual gate: AND <-> OR; NOT <-> NOT.

This blows up every wire to double wires and every gate to double gates.

In a perfect (quasi-static) world, the total power would remain the same regardless of the polarity of any of the input signals.

This mirrored logic is a kind of "super CMOS".

Yes, this circuitry is twice as expensive as regular circuitry, but it has additional benefits: power side-channel resistance and *noise resistance* -- including *radiation hardening*.

Remaining problems: bit *changes* are still glaringly obvious, and deep levels of combinational logic could get out of *sync* unless locally synchronized or clocked.

So far, this additional expense is easily doable, so I don't know why this mirroring couldn't be routine.

At 02:39 PM 5/16/2018, Henry Baker wrote:
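The mirroring rule Henry describes (double every wire, pair every gate with its De Morgan dual: AND <-> OR, NOT <-> NOT) can be sketched as a netlist transformation. This is an editor's toy evaluator, and the example circuit is invented purely for illustration:

```python
DUAL = {"AND": "OR", "OR": "AND", "NOT": "NOT"}

def eval_gate(op, ins):
    if op == "AND":
        return ins[0] & ins[1]
    if op == "OR":
        return ins[0] | ins[1]
    return 1 - ins[0]                       # NOT

def run(netlist, inputs):
    """netlist: list of (out, op, in_names). Returns all wire values,
    including a mirrored complement wire '~w' for every wire 'w'."""
    w = dict(inputs)
    w.update({"~" + k: 1 - v for k, v in inputs.items()})
    for out, op, ins in netlist:
        w[out] = eval_gate(op, [w[i] for i in ins])
        w["~" + out] = eval_gate(DUAL[op], [w["~" + i] for i in ins])
    return w

# Tiny example circuit: x = (a AND b), y = NOT x, z = (y OR c)
net = [("x", "AND", ["a", "b"]), ("y", "NOT", ["x"]), ("z", "OR", ["y", "c"])]

weights = set()
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            w = run(net, {"a": a, "b": b, "c": c})
            # Each wire pair carries exactly one asserted rail...
            assert all(w[k] + w["~" + k] == 1
                       for k in w if not k.startswith("~"))
            weights.add(sum(w.values()))
assert weights == {6}   # 6 signal pairs -> 6 asserted rails, for all inputs
```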
Is there any connection with reversibility, I wonder idly ... WFL

On 5/20/18, Henry Baker <hbaker1@pipeline.com> wrote:
Just having inverse signals doesn't solve the problem. CMOS power consumption comes primarily from *switching* from 0 to 1 or 1 to 0 (except for leakage, of course). So to make this truly work you'd need to somehow have logic in which, for every 0-1 transition on one gate, there's a lack of transition on a nearby gate. -tom

On Sat, May 19, 2018 at 5:59 PM, Henry Baker <hbaker1@pipeline.com> wrote:
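Tom's objection is easy to check in the same toy level-counting spirit: value-plus-complement balances the number of *high* wires, yet the number of wires that *flip* between consecutive words still depends on the data, so switching energy still leaks:

```python
def rails(word, width=8):
    """Value-plus-complement bus: 2*width wires, levels always balanced."""
    bits = [(word >> i) & 1 for i in range(width)]
    return bits + [1 - b for b in bits]

def flips(a, b):
    """Wires that switch between two consecutive bus states."""
    return sum(x != y for x, y in zip(a, b))

assert sum(rails(0x00)) == 8 and sum(rails(0xFF)) == 8   # levels balanced
assert flips(rails(0x00), rails(0xFF)) == 16             # all bits changed
assert flips(rails(0x00), rails(0x01)) == 2              # one bit changed
# Same static weight in both cases, but very different switching activity.
```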
--
http://cube20.org/
http://golly.sf.net/
Not long ago someone found a way to obfuscate programs into essentially irreversible unintelligibility while keeping their behavior intact. Unfortunately, the construction in the paper carried an impractically large overhead. Perhaps there is a way to obfuscate software just as effectively while losing only one order of magnitude in performance. Do you think that approach would thwart power side-channel attacks?

On 5/16/18 10:46, Henry Baker wrote:
Reference?

On Thu, May 17, 2018 at 17:56 Andres Valloud <avalloud@smalltalk.comcastbiz.net> wrote:
One way you can do that is to apply fully homomorphic encryption to a machine emulator. The result is a program that can run encrypted programs without their ever being decrypted: https://en.wikipedia.org/wiki/Homomorphic_encryption#Fully_homomorphic_encry...

Moreover, the encryption is asymmetric, and it is computationally infeasible to extract the private key from the encrypted emulator (because it's encrypted with itself!). The encrypted program runs on the encrypted emulator in linear time, where the linear overhead factor is ridiculously immense (maybe 10^9 or so).

Best wishes,

Adam P. Goucher
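Adam's remark can be made concrete with a toy version of the DGHV "somewhat homomorphic" scheme over the integers. This is a teaching sketch with tiny, insecure parameters, not a real implementation: a bit m is encrypted as c = m + 2r + pq for a secret odd p, decryption is (c mod p) mod 2, and for as long as the noise 2r stays small, adding ciphertexts XORs the plaintext bits while multiplying them ANDs the bits:

```python
import random

p = 10007                              # secret odd modulus (toy size)

def enc(m):
    """Encrypt a single bit m with small random noise."""
    r = random.randrange(1, 10)        # noise term, must stay << p
    q = random.randrange(1, 10**6)
    return m + 2 * r + p * q

def dec(c):
    return (c % p) % 2

for a in (0, 1):
    for b in (0, 1):
        ca, cb = enc(a), enc(b)
        assert dec(ca) == a
        assert dec(ca + cb) == a ^ b   # homomorphic addition -> XOR
        assert dec(ca * cb) == a & b   # homomorphic multiplication -> AND
```

Real FHE bootstraps a scheme like this to refresh the noise, which is where the immense constant-factor overhead comes from.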
On Friday, May 18, 2018 at 2:04 AM, Tom Duff <td@pixar.com> wrote (Re: [math-fun] number arithmetic: power side-channels?):
Reference?
On Thu, May 17, 2018 at 17:56 Andres Valloud <avalloud@smalltalk.comcastbiz.net> wrote:
Not long ago someone found a way to obfuscate programs into essentially irreversible unintelligibility while keeping their behavior intact. Unfortunately, in the paper this achievement required impractical degrees of overhead. Perhaps there is a way to obfuscate software just as effectively while losing only one order of magnitude in performance. Do you think that approach would thwart power side-channel attacks?
On 5/16/18 10:46, Henry Baker wrote:
> **Are there computer arithmetic circuits which draw the same sequence of instantaneous power draws *regardless* of the numbers being computed or moved?**
For example, https://eprint.iacr.org/2014/779.pdf.
On 5/17/18 18:04, Tom Duff wrote:
Reference?
participants (8):
- Adam P. Goucher
- Andres Valloud
- Fred Lunnon
- Henry Baker
- Mike Stay
- Tom Duff
- Tom Knight
- Tomas Rokicki