Is classical memory enough to store the states of up to a 40-qubit quantum system?
As part of a discussion with my 'classical' friend, he insisted that building a state machine for computing the outcomes of a quantum computer is possible: simply calculate the outcomes of (known) algorithms on supercomputers and store the results in a look-up table (something like storing a truth table).
So why do people work on quantum simulators (say, capable of up to 40 qubits) that calculate the result every time? Why not (hypothetically) use the supercomputers of the world (say, capable of up to 60 qubits), calculate the results for all $2^{60}$ input cases, store them, and use the table as a reference? How can I convince him this is not possible?
Note: this is for known quantum algorithms and their known circuit implementations.
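(As a back-of-the-envelope sketch, not part of the original question: the title's "40 qubits" threshold comes from state-vector storage. A full n-qubit state has $2^n$ complex amplitudes; at 16 bytes per amplitude, 40 qubits already needs about 16 TiB.)

```python
# Memory needed to hold a full n-qubit state vector classically.
# Assumes one complex128 amplitude (16 bytes) per basis state.
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50, 60):
    print(f"{n} qubits: {state_vector_bytes(n) / 2**40:.6g} TiB")
```

At 40 qubits this prints 16 TiB, which is roughly where today's largest shared-memory machines top out; each extra qubit doubles the requirement.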
quantum-algorithms quantum-computer quantum-simulation
I'd suggest a more extreme 'classical' approach: at the end of the day, any quantum algorithm is a unitary transformation applied to an n-qubit system, so it can be described by a $2^n \times 2^n$ unitary matrix. We could therefore create a list of known quantum algorithms described as unitary matrices, and running an algorithm would simply be multiplying the matrix by an input vector, which would be fast. Of course, there are memory requirements to consider... – kludg, Aug 20 at 14:19
Exactly. And I believe the memory requirement would increase steeply as n increases. – epr_guy, Aug 20 at 14:24
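The memory growth kludg alludes to can be made concrete (a rough sketch, not from the thread itself): a dense $2^n \times 2^n$ complex matrix has $4^n$ entries, so it grows quadratically faster than the state vector.

```python
# Bytes to store a dense 2^n x 2^n unitary as complex128 (16 bytes/entry).
def unitary_bytes(n_qubits: int) -> int:
    dim = 2 ** n_qubits
    return dim * dim * 16

for n in (10, 20, 30, 40):
    print(f"n = {n}: {unitary_bytes(n) / 2**40:.4g} TiB")
```

Already at n = 20 the matrix alone takes 16 TiB, and at n = 40 it is hopeless, which is why simulators apply gates to the state vector rather than materializing the full unitary.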
asked Aug 20 at 10:28 by epr_guy; edited Aug 20 at 14:25
2 Answers
Suppose you have a quantum algorithm with $2^{60}$ possible inputs. Suppose also that running it once takes 1 nanosecond on a supercomputer (which is unrealistically optimistic!). The total time required to run through all possible inputs would then be about 36.5 years.
Clearly, it is much better to run just the instance you care about and get the answer in an instant than to wait half a lifetime to pick it from a list. This becomes ever more true as we raise the runtime above the unrealistic 1 nanosecond.
As for "why do people work on quantum simulators (say, capable of up to 40 qubits) that calculate the result every time?": even if you wanted to create a lookup table, you'd still need a simulator like this to create it.
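The 36.5-year figure is easy to verify (a quick check, assuming the answer's 1 ns per serial run):

```python
# Time to enumerate all 2^60 inputs serially at 1 ns per run.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
total_seconds = 2**60 * 1e-9          # 2^60 runs, 1e-9 s each
print(f"{total_seconds / SECONDS_PER_YEAR:.1f} years")  # 36.5 years
```

And this is only the time to *build* the table; the storage for $2^{60}$ results is a separate, equally prohibitive cost.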
The current #1 Top500 supercomputer, at Oak Ridge, is listed as having 2.3M cores (POWER9 and CUDA Volta; I do not know the breakdown, they probably lump them together in the stats). Assuming the computation is fully parallelizable, which it is, that shaves quite a lot off the estimate, down to about 20 minutes. Even multiplying the simulation time by 12 puts it at a reasonable 4 hours and an energy spend of a mere 32 MW·h :) – kkm, Aug 20 at 19:09
For a specific quantum algorithm that uses 40 qubits, your friend makes a good point. One can just calculate the truth table (one might find this hard, but assume that one can) and use it as a reference. Of course, this starts to get ridiculous as you increase the number of qubits, not just because of the number of inputs but because computing the outcome of a quantum algorithm could, for all we know, be exponentially harder classically.
However, being able to simulate a quantum computer (or having an actual quantum computer) is far more useful. By changing which quantum operations one performs, one gets different algorithms. The number of functions one can define on 40 bits of input is $2^{2^{40}}$. Having a single database that gives you instant access to the results of any quantum algorithm is absurdly infeasible. We also want to be able to switch algorithms easily, and classically we'd want simulators for that.
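To get a feel for the scale of $2^{2^{40}}$ (a back-of-the-envelope sketch, not from the answer itself): even a single truth table for 40-bit inputs with one output bit is $2^{40}$ bits, and the count of all such tables is a number too large to write out.

```python
import math

N_BITS = 40
table_bits = 2 ** N_BITS                      # one output bit per input
print(f"one truth table: {table_bits / 8 / 2**30:.0f} GiB")  # 128 GiB

# 2^(2^40) distinct Boolean functions on 40 bits; just its decimal length:
digits = int(table_bits * math.log10(2)) + 1
print(f"2^(2^40) has about {digits:,} decimal digits")
```

So storing one algorithm's table is merely expensive (128 GiB), but a database covering *all* possible 40-bit functions would need a number of entries whose decimal representation alone runs to hundreds of billions of digits.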
Can you please tell me how you calculated $2^{2^{40}}$? – epr_guy, Aug 22 at 1:11
Each function is defined uniquely by a truth table. For a 40-bit input, the truth table is $2^{40}$ bits long. So the number of truth tables (and hence the number of functions) is the number of bitstrings of length $2^{40}$, which is $2^{2^{40}}$. – SuhailSherif, Aug 23 at 2:39
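The counting argument in this comment can be sanity-checked for tiny n by enumerating the truth tables directly (an illustrative sketch; names are my own):

```python
def num_functions(n_bits: int) -> int:
    # Each function {0,1}^n -> {0,1} is a truth table of 2^n bits,
    # so there are 2^(2^n) distinct functions.
    return 2 ** (2 ** n_bits)

# Brute-force check for n = 2: enumerate all 4-row truth tables.
n = 2
tables = {tuple((t >> row) & 1 for row in range(2 ** n))
          for t in range(2 ** (2 ** n))}
assert len(tables) == num_functions(n) == 16
print(num_functions(n))  # 16
```

For n = 2 there are exactly $2^{2^2} = 16$ Boolean functions; the same formula with n = 40 gives the $2^{2^{40}}$ in the answer.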
(First answer, accepted: answered Aug 20 at 11:33 by James Wootton.)
(Second answer: answered Aug 21 at 11:05 by SuhailSherif.)