Is classical memory enough to store the states of a 40-qubit quantum system?

As part of a discussion with my 'classical' friend, he insisted that building a state machine to calculate the outcome of a quantum computer is possible: simply compute the outcomes of (known) algorithms on supercomputers and store the results in a lookup table (something like storing a truth table).


So why do people work on quantum simulators (say, capable of up to 40 qubits), which calculate the result every time? Why not (hypothetically) use the world's supercomputers (say, capable of up to 60 qubits), calculate the results for all $2^{60}$ input cases, store them, and use the table as a reference? How can I convince him this is not possible?

Note: this is for known quantum algorithms and their known circuit implementations.
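A back-of-envelope sketch (not from the question itself) of how large such a lookup table would be, assuming optimistically that each of the $2^{60}$ inputs needs only one 8-byte result:

```python
# Back-of-envelope size of a lookup table over all 60-bit inputs.
# Assumption: one 8-byte result per input (very optimistic).
entries = 2 ** 60
bytes_per_entry = 8
total_bytes = entries * bytes_per_entry

print(f"entries:    {entries:.3e}")                         # ~1.153e+18
print(f"table size: {total_bytes / 1e18:.1f} exabytes")     # ~9.2 EB
```

Roughly 9.2 exabytes just to hold the answers for a single fixed 60-qubit circuit, before considering how long it takes to fill the table.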







  • 2

    I'd suggest a more extreme 'classical' approach: at the end of the day, any quantum algorithm is a unitary transformation applied to an n-qubit system; it can be described by a $2^n \times 2^n$ unitary matrix. So we could create a list of known quantum algorithms, each described as a unitary matrix; running an algorithm would then simply be multiplying the matrix by an input vector, and it would be fast. Of course, there are memory requirements to consider...
    – kludg
    Aug 20 at 14:19
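The matrix-based scheme from this comment can be sketched in a few lines of NumPy; the loop at the end makes the "memory requirements" caveat concrete by computing the cost of merely storing one dense complex128 matrix (the function name is mine, for illustration):

```python
import numpy as np

def dense_unitary_sim(U, state):
    """Apply a 2^n x 2^n unitary matrix to a 2^n state vector."""
    return U @ state

# Example: a single Hadamard gate on n = 1 qubit.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi = np.array([1.0, 0.0])            # the |0> state
print(dense_unitary_sim(H, psi))      # ~[0.707, 0.707]

# Memory needed just to STORE the matrix: 16 bytes per complex128 entry,
# and the matrix has (2^n)^2 = 4^n entries.
for n in (10, 20, 30, 40):
    matrix_bytes = 16 * (2 ** n) ** 2
    print(f"n = {n:2d}: {matrix_bytes / 1e9:.3g} GB")
```

At n = 20 the matrix alone is already ~18,000 GB, and at n = 40 it is ~2e16 GB, which is the memory wall the next comment refers to.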











  • Exactly. And I believe the memory requirement would increase steeply as n increases.
    – epr_guy
    Aug 20 at 14:24
















asked Aug 20 at 10:28 by epr_guy · edited Aug 20 at 14:25







2 Answers

















Answer (accepted, 14 votes; answered Aug 20 at 11:33 by James Wootton):
Suppose that you have a quantum algorithm with $2^{60}$ possible inputs. Suppose also that it would take 1 nanosecond to run it on a supercomputer (which is unrealistically optimistic!). The total time required to run through all possible inputs would be 36.5 years.



Clearly it would be much better to just run the instance that you care about, and get the answer in an instant, rather than waiting half a lifetime to pick it from a list. This gets ever more true as we raise the runtime from the unrealistic 1 nanosecond.
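A quick check of the 36.5-year figure, under the same assumption of exactly 1 nanosecond per run:

```python
# 2^60 runs at 1 nanosecond each, converted to years.
seconds = 2 ** 60 * 1e-9
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.1f} years")   # ~36.5
```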




why do people work on quantum simulators (say, capable of up to 40 qubits), which calculate the result every time?




Even if you wanted to create a lookup table, you'd still need a simulator like this to create it.






  • 1

    The current #1 Top500 supercomputer, at Oak Ridge, is listed as having 2.3M cores (POWER9 and CUDA Volta; I don't know the breakdown, they probably lump them together in the stats). Assuming the computation is fully parallelizable, which it is, that shaves quite a lot from the estimate, down to about 20 minutes. Even multiplying the sim time by 12 puts it at a reasonable 4 hours and an energy spend of a mere 32 MW·h :)
    – kkm
    Aug 20 at 19:09


















Answer (3 votes; answered Aug 21 at 11:05 by SuhailSherif):
For a specific quantum algorithm that uses 40 qubits, your friend makes a good point: one could just compute the truth table (this might be hard, but assume it can be done) and use it as a reference. Of course, this starts to get ridiculous as you increase the number of qubits, not just because of the number of inputs but because, for all we know, computing the outcome of a quantum algorithm could be exponentially harder classically.



However, being able to simulate a quantum computer (or having an actual quantum computer) is far more useful. By changing which quantum operations one applies, one gets different algorithms. The number of functions one can define on 40 bits of input is $2^{2^{40}}$. Having a single database that gives instant access to the results of any quantum algorithm is absurdly infeasible. We also want to be able to switch algorithms easily, and classically we'd want simulators for that.
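The $2^{2^{40}}$ count can be made concrete with a tiny brute-force check (the helper function here is mine, for illustration, not from the answer):

```python
from itertools import product

def count_boolean_functions(n):
    """Number of distinct functions f: {0,1}^n -> {0,1}.

    Each function is fully described by one truth table of 2^n bits,
    so the count is 2^(2^n)."""
    return 2 ** (2 ** n)

# Verify by enumeration for a tiny n: list every possible truth table.
n = 2
tables = set(product([0, 1], repeat=2 ** n))
assert len(tables) == count_boolean_functions(n) == 16

# For n = 40, even WRITING DOWN the count in binary takes 2^40 bits:
print(2 ** 40 / 8 / 1e9, "GB")   # ~137 GB just for the number itself
```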






  • Can you please tell me how you calculated $2^{2^{40}}$?
    – epr_guy
    Aug 22 at 1:11






  • 1

    Each function is defined uniquely by a truth table. For a 40-bit input, the truth table is $2^{40}$ bits long. So the number of truth tables (and hence the number of functions) is the number of bitstrings of length $2^{40}$, which is $2^{2^{40}}$.
    – SuhailSherif
    Aug 23 at 2:39









