House Price Crash Forum

Archived

This topic is now archived and is closed to further replies.

Bloo Loo

Does Quantum Computing Actually Work?

Recommended Posts

I googled this and you get myriad pages of what quantum computers are supposed to do, qubits being the quantum bits on which we can apparently encode much more info.

I read that whatever algorithm you put in, it produces all answers at once. Good for decryption, perhaps.

My problem is, does this actually happen, and how do you know what the right answer is?


I googled this and you get myriad pages of what quantum computers are supposed to do, qubits being the quantum bits on which we can apparently encode much more info.

I read that whatever algorithm you put in, it produces all answers at once. Good for decryption, perhaps.

My problem is, does this actually happen, and how do you know what the right answer is?

Well, they keep the researchers in gainful employment, so it's a win for them.


Seen a documentary about Google's (?) - one huge great thing cooled by liquid nitrogen. The gist of it was that passwords/security codes will be useless when/if this tech becomes commercially viable.

So you say it does work, but doesn't... that's quantum computing explained perfectly.


Looks like tulip-mania to me - at least I've not seen anything to suggest otherwise.

AFAIK, no "quantum computer" has yet computed anything.

So you claim you have enough expertise to tell whether something has been computed in a quantum way or not?

Wikipedia is the friend here, I think, for the non-expert:

https://en.wikipedia.org/wiki/Quantum_computing

There is general acceptance that quantum computers have been used to practically run quantum algorithms (in particular Shor's algorithm) with small numbers of qubits. The tasks they have performed are trivial mathematical ones - e.g. factoring 15 with a 7-qubit computer.
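To see what "factoring 15" actually involves: Shor's algorithm only uses the quantum hardware for one step, finding the period of a^x mod N; the rest is classical number theory. A rough Python sketch, with the quantum step replaced by brute force (the function names are mine, not from any quantum library):

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a^r mod n == 1 -- the step a quantum computer speeds up.
    Here it is found by brute force, which is only feasible for tiny n."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a=7):
    """Classical post-processing of Shor's algorithm for a toy n."""
    r = find_period(a, n)          # e.g. 7^4 mod 15 == 1, so r = 4
    if r % 2:
        return None                # odd period: retry with a different a
    p = gcd(a ** (r // 2) - 1, n)  # the factors fall out of two gcds
    q = gcd(a ** (r // 2) + 1, n)
    return p, q

print(shor_classical(15))  # (3, 5)
```

The quantum advantage is entirely in `find_period`; the gcd bookkeeping around it runs on an ordinary computer either way.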

The real prize is a commercially available quantum computer with an arbitrarily large number of qubits. There are commercial machines available now with large numbers of qubits, but there is no agreement as to whether these computers work as "full" quantum computers and actually achieve the full theoretical improvement over the classical calculation.

Progress appears to be slow but steady. "Breakthroughs" are announced all the time, any one of which may turn into a game changer, but establishing the potential of the breakthrough technologies is non-trivial.


QCs can work, but you have to maintain quantum coherence (whatever that means), which effectively means it becomes exponentially more difficult to build a quantum computer the more qubits you have.

There is a 'theory' which says that some things, such as prime factorisation, 'have' to be hard to compute no matter what mechanism you use - and that ultimately coherence problems will prove just as intractable as the obstacles that defeat every other approach to the difficult computation.

The commercial computers with large numbers of qubits (essentially D-Wave) don't really do large-qubit quantum computing. As far as I can make out, the fundamental processor unit in a D-Wave has 8 qubits fairly strongly coupled together (but they might not actually be qubits, and might not actually be entangled together, because D-Wave don't like to tell you). These 8-qubit units are then linked together to make a >1000-qubit computer - but it can't do the sort of computations you'd expect of a true >1000-qubit computer, as the qubits aren't all linked together at a quantum level. That said, the D-Wave machines do seem to be able to do more than the sort of computations a 125-core, 8-qubit quantum computer should manage...


So you say it does work, but doesn't... that's quantum computing explained perfectly.

I've only just sussed out the iPhone, so I wouldn't have a clue if it works - just that it was huge, and I don't expect it to be available in laptop/PC form any time soon, whether it works or not.


So, how did they read the results?

And were they fast? The wiki articles are light on how long it took.


So, how did they read the results?

And were they fast? The wiki articles are light on how long it took.

My expertise is in some aspects of the hardware used to do the calcs rather than the nature of the calcs themselves, so I wouldn't trust my explanations completely.

The benefit is in being able to do multiple calculations simultaneously - or, for want of a better description, conduct an algorithm in a reduced number of steps. That doesn't mean it would be *faster* than a normal computer. Say a task needs 50 calculations. In a normal computer each calculation takes, say, 1 microsecond, so the total takes 50 microseconds. In a quantum computer you might do all 50 calculations at once, but that single step might take 1 millisecond. So just because it's a quantum calculation doesn't necessarily mean it will be faster than a conventional computer, just that it has the potential to be. The number of parallel calculations scales exponentially with the number of qubits (an n-qubit register spans 2^n states), so a small increase in qubits leads to a massive increase in the ability to perform parallel/simultaneous calculations.
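That scaling is easy to see on paper: describing an n-qubit register classically takes 2^n complex amplitudes, which is also why simulating big quantum computers on ordinary hardware blows up so fast. A toy sketch (illustrative names, no real quantum library involved):

```python
def statevector_size(n_qubits):
    """Number of complex amplitudes needed to describe n entangled qubits."""
    return 2 ** n_qubits

# Each extra qubit doubles the state space -- and doubles the classical
# memory needed to simulate it (16 bytes per complex128 amplitude).
for n in (1, 10, 20, 30):
    amps = statevector_size(n)
    mem_mib = amps * 16 / 2**20
    print(f"{n:2d} qubits -> {amps:>13,} amplitudes ({mem_mib:,.1f} MiB)")
```

Thirty qubits is already ~16 GiB of amplitudes; a few hundred qubits exceeds any conceivable classical memory, which is the source of the "massive increase" above.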

The results are easy to check. I think the practical proof of quantum computing will only come, though, when the computers are able to vastly outpower current technology. To do this they will need significantly larger numbers of qubits than where we are at the moment. If a quantum computer can consistently crack, say, RSA encryption far faster than a normal computer (which it should be able to do with a reasonable number of qubits), then there would be a decent case for supposing it is a true quantum computer, although from a mathematical perspective that wouldn't be absolute proof. You have to look into the details of the encryption to understand this more. There is plenty of stuff on the net, but on a generalised forum like this you are unlikely to get posts that will significantly further your education.
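The "easy to check" point is worth making concrete: finding the factors of a number is slow, but verifying a claimed answer is a single multiplication. A minimal sketch, with trial division standing in for the hard search:

```python
def trial_division(n):
    """Slow classical factor search -- the part a quantum computer would speed up."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

p, q = trial_division(2021)
assert p * q == 2021   # verification: one multiplication, however the answer was found
print(p, q)            # 43 47
```

This asymmetry is exactly why a factoring demo is convincing: you don't have to trust the machine's internals, only multiply its answer back together.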


Thanks for the effort put in by previous posters valiantly doing their best to explain this quantum stuff to numpties like me.

Sadly, you've failed...! ;)

Not your fault though guys - I've never understood anything that starts with the word "quantum"...

XYY


My expertise is in some aspects of the hardware used to do the calcs rather than the nature of the calcs themselves, so I wouldn't trust my explanations completely.

The benefit is in being able to do multiple calculations simultaneously - or, for want of a better description, conduct an algorithm in a reduced number of steps. That doesn't mean it would be *faster* than a normal computer. Say a task needs 50 calculations. In a normal computer each calculation takes, say, 1 microsecond, so the total takes 50 microseconds. In a quantum computer you might do all 50 calculations at once, but that single step might take 1 millisecond. So just because it's a quantum calculation doesn't necessarily mean it will be faster than a conventional computer, just that it has the potential to be. The number of parallel calculations scales exponentially with the number of qubits (an n-qubit register spans 2^n states), so a small increase in qubits leads to a massive increase in the ability to perform parallel/simultaneous calculations.

The results are easy to check. I think the practical proof of quantum computing will only come, though, when the computers are able to vastly outpower current technology. To do this they will need significantly larger numbers of qubits than where we are at the moment. If a quantum computer can consistently crack, say, RSA encryption far faster than a normal computer (which it should be able to do with a reasonable number of qubits), then there would be a decent case for supposing it is a true quantum computer, although from a mathematical perspective that wouldn't be absolute proof. You have to look into the details of the encryption to understand this more. There is plenty of stuff on the net, but on a generalised forum like this you are unlikely to get posts that will significantly further your education.

I have to say, your presentation wasn't particularly convincing.


You'd take years with log tables and a slide rule to check it...

I can do that! :blink:


We have all done that! Now see me after assembly! :huh:


Your work sounds a bit of a mixed bag, hotty.

If it were a bag of sweets it would be round-the-trees fluke Pascals...

;)

XYY


Not the documentary I saw, but it looks like the same machine.

Lots of "I may be able to" statements in that D-Wave advert... it looks to me like a supercooled chip... that in itself would speed things up, I believe.


So the only firm to sell what may be a quantum computer is a Canadian company called D-Wave. They have sold systems to the likes of NASA, Google and probably some other agencies.

As described, a lot of the focus on these systems is on whether they are really behaving in a quantum way, and there are a lot of challenges in assessing this. Do remember that if you are producing multiple solutions at once to a problem such as factorisation, you need a meaningful way of extracting the output, storing it and reviewing it. All this is more complex than you might at first think.

Also, do remember that instead of bits these devices have qubits, and at the moment they have just reached 1000 qubits - but I am pretty convinced that is just like a single long register, so not all that flexible.

...and as others have mentioned, you need tonnes of cooling kit.


Also, do remember that instead of bits these devices have qubits, and at the moment they have just reached 1000 qubits - but I am pretty convinced that is just like a single long register, so not all that flexible.

I think that maybe in principle a 1000-qubit register could store any(?) subset of the 2^1000 possible values simultaneously, so that's slightly more versatile than a classical 1000-bit register. I might well be wrong about that though...


I think that maybe in principle a 1000-qubit register could store any(?) subset of the 2^1000 possible values simultaneously, so that's slightly more versatile than a classical 1000-bit register. I might well be wrong about that though...

The D-Wave [version of a 1000-qubit processor] behaves more like 125 x 8-qubit registers. And even the 8-qubit registers don't seem to be that coherent - I'd be happy to believe they exist as 4 x 2-qubit elements, where the qubit pairs are entangled properly with one another but not that well with the other three pairs, and not really at all with the other 124 registers. But it is very difficult to get sensible info out of D-Wave.

You couldn't use a D-Wave to solve a 1000-bit factorisation problem, although it could possibly factorise 125 8-bit numbers simultaneously (not that great a feat). It does seem to have other applications in solving optimisation problems and in pattern matching, where it does seem to offer some speed advantage over conventional computational approaches, even in its current state.
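For what it's worth, the optimisation problems D-Wave targets are usually phrased as QUBO (quadratic unconstrained binary optimisation): minimise x'Qx over binary vectors x. A brute-force classical sketch of the problem shape (the toy matrix and function name are mine); the annealer's claim is to search the 2^n space faster than this:

```python
from itertools import product

def solve_qubo(Q):
    """Exhaustively minimise sum_ij Q[i][j]*x[i]*x[j] over x in {0,1}^n.
    Cost is 2^n evaluations -- fine for toys, hopeless for large n."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Tiny example: reward picking x0 or x1 (-1 each), penalise picking both (+2).
Q = [[-1, 2],
     [0, -1]]
print(solve_qubo(Q))  # ((0, 1), -1)
```

Pattern-matching and scheduling problems can often be squeezed into this form, which is why those are the applications where the machine gets tried first.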


I did a dissertation at uni called 'Pseudo Parallelism in Pascal'. Actually, as I had to do a viva, I changed it to 'Pseudo Concurrency in Pascal' as it was easier to say. This was before the PC had hit the shops, so things were pretty basic - having to book mainframe processing power and all that. Anyway, that's by the by.

The thing is, multi-processing and the concurrent processing of different processes weren't available yet. However, the mathematical possibilities were being keenly explored. Initially it involved running multiple processes on a single processor, as an operating system does - printing, calculating, storing data etc. Basically you had to share out the processor as a scarce resource to multiple processes using different queueing methodologies (FIFO, FILO etc.). But you can also tackle complex problems by breaking them down into their components and sending each component off for processing separately before coming back together for the next step. Think of brackets in an equation - the contents of each bracket can be sent off separately. Now, obviously parcelling up time on a single processor hastens things not one jot... but if you had several linked processors, the possibilities are endless.

The Pseudo Parallelism bit in my thesis was about breaking into the Pascal compiler so I could mimic multi-processing on a single processor and create crude operating systems, email-like send/receive messages etc.
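The "send each bracket off separately" idea can be sketched with a modern thread pool - here the two brackets of (a + b + c) * (d + e + f) run as independent tasks before being combined (a minimal illustration, not how the Pascal version worked):

```python
from concurrent.futures import ThreadPoolExecutor

def evaluate(terms):
    # Stand-in for an expensive sub-computation on one bracket's contents.
    return sum(terms)

# Each bracket becomes an independent task; the multiply is the "next step"
# where the separately computed components come back together.
with ThreadPoolExecutor(max_workers=2) as pool:
    left = pool.submit(evaluate, [1, 2, 3])    # (1 + 2 + 3)
    right = pool.submit(evaluate, [4, 5, 6])   # (4 + 5 + 6)
    result = left.result() * right.result()

print(result)  # 6 * 15 = 90
```

On one processor this parcels out time and, as the post says, hastens things not one jot; the same code on several real cores (or a process pool) is where the speed-up appears.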

Parallel processing is great until you see CXPACKET waits or their equivalent forms of parallel skew.

The same problem afflicts MapReduce jobs running in things like Hadoop. In theory it should be a quicker way of processing queries against large volumes of data, by breaking them up and running them against multiple distributed processors across a clustered HDFS. The reality is that, like parallel processing on a conventional RDBMS, it is only as quick as the slowest process, which sometimes is not as fast as running on a single processor accessing a single dedicated file system. Worse, sometimes the batch processes submitted just die and you never get a result. Simply throwing processors at a problem does not always yield the benefits some expect.

For most conventional business computing problems outside of academia, the biggest issue is still the Von Neumann bottleneck. Processors are a lot faster than the speed at which computers can shift data between disk, memory and CPU, so a lot of time is simply spent waiting for things to happen. I am not sure quantum computing is really addressing that problem.

