IBM quantum computing


assersamak
Posts: 1
Joined: Tue Feb 28, 2012 7:20 am UTC

IBM quantum computing

Postby assersamak » Tue Feb 28, 2012 7:25 am UTC

I just read in the news that IBM announced it's closer than ever to building the first quantum computer. Then I couldn't help but think of a possible xkcd comic about it. Here's what I thought of!

- I don't get it!
IBM spends all this money on security research to come up with products like SFED and cryptolite, and at the same time it says it's really close to quantum computing?!

- So?

- Basically they're selling us software that's breakable by the very next computer they're making. Isn't that hypocritical?

- I bet the first demo they'll run is an attempt to break their own encryption. Either way it goes, it's a win for them.

Derek
Posts: 2181
Joined: Wed Aug 18, 2010 4:15 am UTC

Re: IBM quantum computing

Postby Derek » Tue Feb 28, 2012 9:57 am UTC

Would you rather your locksmith be on the cutting edge of lock picking research, or know nothing at all about lock picking?

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Re: IBM quantum computing

Postby skeptical scientist » Tue Feb 28, 2012 11:16 am UTC

Shame on the OP for posting a news thread without a link to the article. Here is a decent NYTimes article I found.*

To answer the OPs confusion about IBM's simultaneous pursuit of both quantum computing and classical cryptography, first of all, quantum computing is still many years away. According to an IBM researcher interviewed in the article, quantum computers might only be 15 years away. Since he's a biased source, that can probably be considered an unreasonably realistic assumption, so there's no reason for IBM not to pursue short-term technology which would be out-of-date by then anyways.

Also, it's important to remember that quantum computing doesn't necessarily break all current crypto algorithms. Using known quantum computer algorithms, it only breaks crypto which is based on prime factorization being hard, and forces other crypto to use longer keys. So public-key cryptography as we know it will be in serious trouble, but private-key cryptography (like what you use to encrypt your hard drive) won't be significantly affected. See Wikipedia:Key size.
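A rough way to quantify the "longer keys" point: Grover's algorithm searches an unstructured n-bit keyspace in about 2^(n/2) steps, so a quantum attacker roughly halves the effective bit strength of an ideal symmetric cipher, and doubling the key length restores it. A minimal sketch of that rule of thumb (the function name is mine, not from any crypto library):

```python
# Rule of thumb: Grover's algorithm searches an n-bit keyspace in about
# 2**(n/2) steps, so a quantum attacker halves the effective key length
# of an ideal symmetric cipher. Doubling the key size restores it.
def effective_bits(key_bits, quantum_attacker=False):
    """Effective security level, in bits, of an ideal n-bit symmetric key."""
    return key_bits // 2 if quantum_attacker else key_bits

for bits in (128, 256):
    print(bits, "->", effective_bits(bits, quantum_attacker=True))
```

So AES-128 drops to roughly 64-bit effective strength against an ideal quantum attacker while AES-256 keeps a comfortable 128 bits, which is why symmetric crypto survives with longer keys while factorization-based public-key crypto does not.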

*Just for the humor value, here is the first article I read, which was terrible. My favorite part: "For example, today's best multi-core processors can encrypt or decrypt a 150-digit number, but if you were interested in decrypting a 1,000-digit number, it would take roughly all the computational resources in the world to do it, Ketchen said."
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

Ghostbear
Posts: 1764
Joined: Sat Apr 26, 2008 10:06 pm UTC

Re: IBM quantum computing

Postby Ghostbear » Tue Feb 28, 2012 11:44 am UTC

skeptical scientist wrote:To answer the OPs confusion about IBM's simultaneous pursuit of both quantum computing and classical cryptography, first of all, quantum computing is still many years away. According to an IBM researcher interviewed in the article, quantum computers might only be 15 years away. Since he's a biased source, that can probably be considered an unreasonably realistic assumption, so there's no reason for IBM not to pursue short-term technology which would be out-of-date by then anyways.

I wonder what exactly they mean by 15 years away. It might not be meant in the sense that we would inherently interpret. Instead of meaning "in 15 years, quantum computing will be at such a stage that it is useful" it might just mean "in 15 years, quantum computing will be possible"- there could easily be another 10 year or longer gap from the second scenario to the first one (if I recall correctly, a lot of the "HD" display tech was first created in the 1970's, for instance). I don't think it's that unlikely of an error either- it seems pretty common for the press to misinterpret something a scientist says, either out of ignorance or sensationalism.

I'm also curious as to another part of the article- a qubit lasting 100 μs not being around long enough for computation would indicate a rather slow clockspeed, only in the KHz range (10 KHz if there was no delay involved with any other functions related to reading/writing/etc. the qubit). I don't know much about quantum computing, but are low clockspeeds a fundamental characteristic of them, or is it just part of the "we're still working on it" factor?
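For what it's worth, coherence time bounds the number of sequential gate operations rather than setting a clock speed directly: the budget is roughly coherence time divided by gate time. A quick sketch of that arithmetic (the gate times below are illustrative assumptions, not figures from the article):

```python
# Operations that fit in a coherence window: budget = coherence / gate time.
# The gate times here are illustrative assumptions, not IBM's numbers.
def op_budget(coherence_s, gate_time_s):
    return round(coherence_s / gate_time_s)

coherence = 100e-6                   # the 100 microseconds from the article
for gate_ns in (10, 100, 1000):     # assumed per-gate times in nanoseconds
    print(gate_ns, "ns gates ->", op_budget(coherence, gate_ns * 1e-9), "ops")
```

With ~10-100 ns gates that's only thousands to tens of thousands of sequential operations before decoherence, which is one reason the article treats 100 μs as merely "becoming doable" rather than solved.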

skeptical scientist wrote:*Just for the humor value, here is the first article I read, which was terrible. My favorite part: "For example, today's best multi-core processors can encrypt or decrypt a 150-digit number, but if you were interested in decrypting a 1,000-digit number, it would take roughly all the computational resources in the world to do it, Ketchen said."

I think my brain just melted from the stupidity of it- it makes no sense! There's no reference to time or.. anything. Even for a dumb statement, I think it's bad. Sadly though, the NYT made a similar error:
New York Times wrote:For certain problems like searching databases or factoring very large numbers — the basis of today’s encryption techniques — quantum computers could produce an answer in days or maybe even seconds, whereas the fastest conventional computer would take longer than 13.7 billion years.

Would take 13.7 billion years to do.. what? Factor a very large number? What size of very large number? Which fastest conventional computer?
*runs off to be grumpy somewhere else*

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Re: IBM quantum computing

Postby skeptical scientist » Tue Feb 28, 2012 12:01 pm UTC

Ghostbear wrote:I'm also curious as to another part of the article- a qubit lasting 100 μs not being around long enough for computation would indicate a rather slow clockspeed, only in the KHz range (10 KHz if there was no delay involved with any other functions related to reading/writing/etc. the qubit). I don't know much about quantum computing, but are low clockspeeds a fundamental characteristic of them, or is it just part of the "we're still working on it" factor?

Actually the article described 100 μs as a major improvement over the previous "few billionths of a second", and meant that quantum computing was "becoming doable".

Ghostbear wrote:Sadly though, the NYT made a similar error:
New York Times wrote:For certain problems like searching databases or factoring very large numbers — the basis of today’s encryption techniques — quantum computers could produce an answer in days or maybe even seconds, whereas the fastest conventional computer would take longer than 13.7 billion years.

Would take 13.7 billion years to do.. what? Factor a very large number? What size of very large number? Which fastest conventional computer?
*runs off to be grumpy somewhere else*

It means there are certain problems which a quantum computer could do in days or maybe even seconds, while the fastest conventional computer running a known algorithm would take longer than 13.7 billion years to solve them. This is true. Some of these problems involve factoring large numbers. I don't see why you're bothered by this. This is not at all like the error that so bothered me about the computerworld article: that author doesn't even know the difference between decryption and factorization! How an IT news site can assign someone like that to write an article on quantum computers is beyond me.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

Ghostbear
Posts: 1764
Joined: Sat Apr 26, 2008 10:06 pm UTC

Re: IBM quantum computing

Postby Ghostbear » Tue Feb 28, 2012 12:19 pm UTC

skeptical scientist wrote:Actually the article described 100 μs as a major improvement over the previous "few billionths of a second", and meant that quantum computing was "becoming doable".

I know. I was curious if the fact that the majorly improved quantity was still fairly long was indicative of the potential clockspeed of quantum computers or if it was just part of the "growing pains" of a yet incomplete technology. I don't know enough about quantum computing to say one way or the other (in fact, I don't know enough about it to know if clockspeed is even a useful metric when dealing with it, or if the influence of the clockspeed is different such that ~10 KHz would be good, or something else entirely).

skeptical scientist wrote:It means there are certain problems which a quantum computer could do in days or maybe even seconds, while the fastest conventional computer running a known algorithm would take longer than 13.7 billion years to solve them. This is true. Some of these problems involve factoring large numbers. I don't see why you're bothered by this. This is not at all like the error that so bothered me about the computerworld article: that author doesn't even know the difference between decryption and factorization! How an IT news site can assign someone like that to write an article on quantum computers is beyond me.

I'm bothered by it because it gives you very little real information. How are they defining "very large number", which "conventional computer" is the "fastest"? Are they assuming ideal OPS, practical OPS, or OPS while working on that specific problem? If you came across that same article five years from now (when the basic gist of it should still have some relevance) then you'll have to be able to guess what the fastest computer was five years ago, if that's the one they meant, if the defining size of a very large number has changed, etc. I know exactly what was meant- that certain problems are significantly easier on a quantum computer than an electrical computer- but the way they compare the two left me wondering how they got those specific numbers. The error made at the other place is worse, for the reasons you mentioned, but I am still bothered by the lack of specific information in that passage.

Zamfir
I built a novelty castle, the irony was lost on some.
Posts: 7604
Joined: Wed Aug 27, 2008 2:43 pm UTC
Location: Nederland

Re: IBM quantum computing

Postby Zamfir » Tue Feb 28, 2012 12:29 pm UTC

Ghostbear wrote:I'm also curious as to another part of the article- a qubit lasting 100 μs not being around long enough for computation would indicate a rather slow clockspeed, only in the KHz range (10 KHz if there was no delay involved with any other functions related to reading/writing/etc. the qubit). I don't know much about quantum computing, but are low clockspeeds a fundamental characteristic of them, or is it just part of the "we're still working on it" factor?


I am sure as hell not an expert here, but I think the qubits have to 'last' for all of the computation, not a single step of it.

EDIT: did more reading, said wrong things.

Ghostbear
Posts: 1764
Joined: Sat Apr 26, 2008 10:06 pm UTC

Re: IBM quantum computing

Postby Ghostbear » Tue Feb 28, 2012 12:51 pm UTC

Zamfir wrote:I am sure as hell not an expert here, but I think the qubits have to 'last' for all of the computation, not a single step of it. Don't trust me on this, but I think quantum calculations don't overwrite qubits. They set up a series of "gates", particular relationships between one set of qubits and another set.

So you start with an input register of qubits. Then you set up a series of gates in such a way that another register far away can be read by macroscopic means, with a high probability of yielding the desired answer. In particular, there's a series of gates where the output register has a high probability of being the discrete fourier transform of the data encoded in the input register, and apparently you can build fancier stuff around that.

Someone with more knowledge should double-check this...

I originally wrote out a question of how that'd still be analogous to a single clock cycle before I got the full gist of what you're saying. In between all the various stages that an operation goes through for a single cycle on an electrical processor, the bits can be refreshed or rewritten (reminiscent of how DRAM works), while with a qubit they can't. Still, wouldn't the fact that a qubit lasts only 100 μs, and needs to last for the full operation, still indicate a very low clockspeed? Or does "all of the computation" mean something more along the lines of "for every cycle needed to be performed to get the final answer that we want" (the difference between "add 0101 to 0111" and "render this video frame")?

Sorry for all the questions- I meant it when I said I knew little about quantum computing (and I fear my knowledge of electrical computing might put me at a "mount stupid"* point for trying to interpret quantum) :).

* Hah, I just noticed how fitting the last part of that is.

Also: damn DFTs, showing up everywhere.

Zamfir
I built a novelty castle, the irony was lost on some.
Posts: 7604
Joined: Wed Aug 27, 2008 2:43 pm UTC
Location: Nederland

Re: IBM quantum computing

Postby Zamfir » Tue Feb 28, 2012 1:10 pm UTC

Ah, I just said that that was wrong. I had different registers in mind, but they are the same set of qubits at different points in time, so there is 'overwriting'. But it's still true that only the last step can be read out.

The qubits store data as a superposition of states, so by measuring them you only randomly get one of those states and you lose the other states (which also contain information). The last step is special, with all the relevant information in one state that has a high probability of being measured. Until you get to that step, you have to have virtually no interaction with the outside world.
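Zamfir's point about the final readout can be sketched in a few lines: a register of n qubits is described by 2^n complex amplitudes, and measuring it samples one basis state with probability equal to the squared magnitude of its amplitude, destroying the rest. (A toy simulation for illustration, not any real quantum API.)

```python
import random

# Toy model: an n-qubit register is 2**n amplitudes whose squared
# magnitudes sum to 1. Measurement samples ONE basis state with
# probability |amplitude|**2 and discards the rest, which is why an
# algorithm must funnel the answer into a high-probability state
# before the final readout.
def measure(amplitudes, rng=random.random):
    r, acc = rng(), 0.0
    for state, amp in enumerate(amplitudes):
        acc += abs(amp) ** 2
        if r < acc:
            return state               # the register collapses here
    return len(amplitudes) - 1         # guard against float round-off

uniform = [0.5, 0.5, 0.5, 0.5]         # two qubits, all outcomes equally likely
done = [0.1, 0.1, 0.1, 0.97 ** 0.5]    # "finished": answer state has prob ~0.97
```

Measuring `uniform` gives each of the four states a 25% chance; measuring `done` yields state 3 about 97% of the time, which is the shape a quantum algorithm aims for at its last step.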

Tirian
Posts: 1891
Joined: Fri Feb 15, 2008 6:03 pm UTC

Re: IBM quantum computing

Postby Tirian » Tue Feb 28, 2012 5:14 pm UTC

Ghostbear wrote:I wonder what exactly they mean by 15 years away. It might not be meant in the sense that we would inherently interpret. Instead of meaning "in 15 years, quantum computing will be at such a stage that it is useful" it might just mean "in 15 years, quantum computing will be possible"- there could easily be another 10 year or longer gap from the second scenario to the first one (if I recall correctly, a lot of the "HD" display tech was first created in the 1970's, for instance). I don't think it's that unlikely of an error either- it seems pretty common for the press to misinterpret something a scientist says, either out of ignorance or sensationalism.


Saying that something is five years away from market means that you're entering the last few phases of trials and you hope that all the small problems you've seen stay small.

Saying that something is ten years away from market means that they've got a good base of basic science and a large pile of data and they hope that sometime in the next three years someone will have an AHA moment and that will give you two years to do some successful preliminary trials before you hit that five year benchmark. Of course, insight is unpredictable in individual cases. An enterprising sociology doctoral student might enjoy investigating whether there is some sort of "half-life" of scientific progress that would make this sort of prediction any more than a wild guess.

Saying that something is fifteen years away means that we need one or two AHA moments to get the basic science or the pile of relevant data and that we can't even say that we're ten years away with a straight face. Scientists have claimed that fusion reactors have been fifteen years away from market for the past fifty years.

Link
Posts: 1419
Joined: Sat Mar 07, 2009 11:33 am UTC
Location: ᘝᓄᘈᖉᐣ

Re: IBM quantum computing

Postby Link » Tue Feb 28, 2012 6:43 pm UTC

Going by #678, I guess 15 years puts it in "it's probably possible (well, maybe), but we have no idea how to actually do it" territory.

thc
Posts: 643
Joined: Fri Feb 08, 2008 6:01 am UTC

Re: IBM quantum computing

Postby thc » Tue Feb 28, 2012 10:34 pm UTC

skeptical scientist wrote:unreasonably realistic

Oh, to be unreasonably realistic!

sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong

Re: IBM quantum computing

Postby sourmìlk » Tue Feb 28, 2012 11:15 pm UTC

My favorite moment of tech-reporting idiocy was some article that said "coding is one of the most popular languages these days". I /facepalmed.

Also, will quantum computers change the way programming works? Like, a qubit isn't just a binary, right? So it seems like that would screw with conventional programming by rendering boolean logical operations obsolete. I don't want to have to relearn how to program :(
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

Dauric
Posts: 3995
Joined: Wed Aug 05, 2009 6:58 pm UTC
Location: In midair, traversing laterally over a container of sharks. No water, just sharks, with lasers.

Re: IBM quantum computing

Postby Dauric » Tue Feb 28, 2012 11:26 pm UTC

sourmìlk wrote:Also, will quantum computers change the way programming works? Like, a qubit isn't just a binary, right? So it seems like that would screw with conventional programming by rendering boolean logical operations obsolete. I don't want to have to relearn how to program :(


I'd imagine that it would add concepts to a quantum programming language rather than replace the entire architecture of how to program. Checking a purely true-false condition wouldn't be improved by replacing the basic "If Then" statement with one that delves into the quantum states of the checkbox being hit and whether Schrodinger's cat has been killed because of it. You'd probably have to learn some new ways to evaluate conditionals, but everything you already know would at least be supported.

The other thing is that most of the difference will be absorbed in the transition from low-level to mid-level languages. Most programmers will be working with something that resembles some recognizable variant of C (or Java, or Python, etc.); they're not going to be working in the qubit equivalent of assembly language. Quantum C will probably have a lot of the same notation, mathematical operations, conditional evaluations, etc. as modern C, with all the translation work to a qubit system done by the compiler.
We're in the traffic-chopper over the XKCD boards where there's been a thread-derailment. A Liquified Godwin spill has evacuated threads in a fourty-post radius of the accident, Lolcats and TVTropes have broken free of their containers. It is believed that the Point has perished.

Shivahn
Posts: 2200
Joined: Tue Jan 06, 2009 6:17 am UTC

Re: IBM quantum computing

Postby Shivahn » Tue Feb 28, 2012 11:43 pm UTC

Also, as I've learned (and been harping on recently, here and elsewhere), learning programming isn't really about if (x==10) cout << "10";. The most important part of programming is making modular, well-documented, intuitive code that hides whatever it can behind abstraction barriers. All of that is going to transfer over: at worst, things like how we approach problems will have to change - not a super small task, but hardly monumental.

sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong

Re: IBM quantum computing

Postby sourmìlk » Tue Feb 28, 2012 11:47 pm UTC

I absolutely agree, but if you have to start relearning things as basic as if (x==10) cout << "10"; then it makes things a lot harder.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

Shivahn
Posts: 2200
Joined: Tue Jan 06, 2009 6:17 am UTC

Re: IBM quantum computing

Postby Shivahn » Tue Feb 28, 2012 11:54 pm UTC

Oh, yeah. Well, Dauric is right that if all of that changes, the only people who are going to need to learn it are likely assembly programmers (though of course, higher-level language programmers would benefit as well).

Dauric
Posts: 3995
Joined: Wed Aug 05, 2009 6:58 pm UTC
Location: In midair, traversing laterally over a container of sharks. No water, just sharks, with lasers.

Re: IBM quantum computing

Postby Dauric » Wed Feb 29, 2012 12:18 am UTC

The other thing is that the vast majority of programming is essentially mathematics. The symbols that we type in to the computer to make it do stuff have meanings that predate the computer, and just because your new calculator is electric rather than beads on a string doesn't mean that the basic fundamentals of mathematics have changed.

Again, we're likely to have to learn new symbols to really get the most out of a quantum processing device, but an accounting spreadsheet will still have to recognize '<' as 'less than', and...

x=4
y=10

x<y will still evaluate to 'true' or someone's going to get fired for 'creative accounting'.

I'd guess that we'll get something like '<~', '>~' and '=~' to check for the new "Maybe" state, but I doubt that we'd have to learn '#^%' is the conditional evaluation to test for 'frud'.
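For illustration only, here's one way Dauric's hypothetical '<~' could behave; everything below (the `maybe_lt` name, the distribution encoding) is invented for the sketch, not taken from any real quantum language. A value that hasn't been measured is only known as a distribution over outcomes, so a "maybe less than" naturally returns a probability rather than a bare boolean:

```python
# Hypothetical sketch of a "maybe" comparison: names and semantics are
# invented for illustration, not from any real quantum language.
# An unmeasured value is known only as a distribution over outcomes,
# so "x <~ y" would naturally yield P(x < y) instead of a plain bool.
def maybe_lt(x_dist, y):
    """P(x < y), with x given as a {value: probability} mapping."""
    return sum(p for v, p in x_dist.items() if v < y)

x = {3: 0.7, 12: 0.3}          # x is 3 with probability 0.7, 12 with 0.3
print(maybe_lt(x, 10))          # a classical x = 4 would simply give 1.0
```

The definite cases reduce to ordinary booleans: `maybe_lt({4: 1.0}, 10)` is 1.0, so the classical '<' really would survive as a special case, as Dauric suggests.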
We're in the traffic-chopper over the XKCD boards where there's been a thread-derailment. A Liquified Godwin spill has evacuated threads in a fourty-post radius of the accident, Lolcats and TVTropes have broken free of their containers. It is believed that the Point has perished.

KnightExemplar
Posts: 5494
Joined: Sun Dec 26, 2010 1:58 pm UTC

Re: IBM quantum computing

Postby KnightExemplar » Wed Feb 29, 2012 12:37 am UTC

Isn't one of the big issues with quantum computing keeping all of the bits entangled together?

I'm not an expert, but I was introduced to the concept of quantum computing in my Computer Architecture class. New quantum gates and quantum calculations and so forth. It's a new system of computation that holds promise (like quadratically faster unstructured search via Grover's algorithm). However, my understanding of quantum computing was that keeping a number of bits entangled is a very difficult problem.

The largest quantum computer, last time I checked, was 14 qubits. To perform Shor's algorithm to break 1024-bit RSA encryption, you'll need a 2048-qubit register AND a 1024-qubit register. Until we start getting qubit counts of the magnitude we need... there's almost no point thinking more about quantum computing. I'm not sure how many qubits are needed to be "useful" in a typical situation, but certainly more than the current world record of 14.
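Using the same accounting as the post (a 2n-qubit period register plus an n-qubit work register for an n-bit modulus), the gap between the 14-qubit record and what textbook Shor's algorithm needs is easy to tabulate; real machines would also need substantial extra qubits for error correction on top of these logical counts:

```python
# Logical qubits for textbook Shor's algorithm on an n-bit RSA modulus,
# using the post's accounting: a 2n-qubit register plus an n-qubit one.
# (Error-corrected physical qubit counts would be far larger still.)
def shor_logical_qubits(modulus_bits):
    return 2 * modulus_bits + modulus_bits

record = 14                              # largest register cited in the post
for rsa_bits in (1024, 2048):
    need = shor_logical_qubits(rsa_bits)
    print(f"RSA-{rsa_bits}: {need} qubits needed, record is {record}")
```

So even ignoring error correction, breaking RSA-1024 this way wants a couple of hundred times more qubits than the cited record.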


EDIT:
Shivahn wrote:Oh, yeah. Well, Dauric is right that if that all changes, the only people who are going to need to learn it are likely assembly programmers (though of course, higher language programmers would benefit as well.)


Consider the major issues people have had making SIMD mainstream. SIMD instructions have been around for almost two decades now on normal computers (MMX coming out with the Pentium in 1996). However, no compiler can really automatically translate your "for" loops into SIMD commands. Instead, new languages like CUDA and OpenCL, explicitly designed with the "SIMD mindset", have been created.

When the primitives change, the programming model changes. To take advantage of SIMD acceleration, you either dip down into Assembly language (or equivalent: the C intrinsic functions), or use high-level languages explicitly designed to take advantage of these things.
First Strike +1/+1 and Indestructible.

bosonicyouth
Posts: 92
Joined: Sat May 01, 2010 8:59 am UTC
Location: Brisbane, Australia

Re: IBM quantum computing

Postby bosonicyouth » Wed Feb 29, 2012 2:44 am UTC

It seems to be a common misconception that computers in the future will be entirely quantum right down to the bottom. This makes no sense practically, since quantum analogues of most algorithms offer no speedup (indeed, with all the error correction required, the quantum version would always be slower). I imagine that when we do have a working quantum computer, it will essentially be a separate processor attached to a classical computer, designed specifically to work on some classically hard NP-complete problem, returning an answer and some error estimate. It would then be the classical part of the computer's job to interpret code, convert between NP problems, check answers, etc. Fifteen years for something like this is, I think, not an unreasonable estimate. Having an entirely quantum computer with some kind of quantum assembly language is another thing altogether, but fortunately it's not necessary.
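The division of labour described above (quantum co-processor proposes, classical host verifies and retries) can be sketched with the quantum part stubbed out. Everything here is an invented illustration; the stub just guesses. The point is that checking an answer is classically cheap even when finding it isn't:

```python
import random

# Sketch of the hybrid loop described above. The "quantum" co-processor
# is stubbed with a random guess; the classical host does the cheap
# part: verifying the probabilistic answer and retrying until it checks.
def quantum_coprocessor_stub(n, rng=random.randrange):
    return rng(2, n)                 # stand-in for a noisy quantum result

def factor_with_coprocessor(n, attempts=100_000):
    for _ in range(attempts):
        candidate = quantum_coprocessor_stub(n)
        if n % candidate == 0:       # classical verification: one modulo
            return candidate
    return None                      # co-processor never produced a factor

print(factor_with_coprocessor(15))   # prints 3 or 5
```

The asymmetry is the whole design: verification stays fast on the classical side, so a noisy, probabilistic co-processor is still useful.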

sourmìlk
If I can't complain, can I at least express my fear?
Posts: 6393
Joined: Mon Dec 22, 2008 10:53 pm UTC
Location: permanently in the wrong

Re: IBM quantum computing

Postby sourmìlk » Wed Feb 29, 2012 3:32 am UTC

I like optical computers better. The mechanics are different, but the theory is the same.
Terry Pratchett wrote:The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.

Thesh
Made to Fuck Dinosaurs
Posts: 6598
Joined: Tue Jan 12, 2010 1:55 am UTC
Location: Colorado

Re: IBM quantum computing

Postby Thesh » Wed Feb 29, 2012 3:52 am UTC

skeptical scientist wrote:Using known quantum computer algorithms, it only breaks crypto which is based on prime factorization being hard


Well, I wouldn't say "Only". Prime factorization, discrete logarithms, and elliptic curves are all used for public-key cryptography, and are all likely to become breakable with quantum computers sometime in the future. However, there is research being done on algorithms that would not be breakable with known algorithms, quantum or otherwise. Let's just hope that we can make them widespread before quantum computing becomes practical for governments.

Although my biggest concern right now is that AES will be breakable in the next 15 years.

This article has a short mention about methods for creating asymmetric algorithms that may be secure against quantum attacks:
http://en.wikipedia.org/wiki/Post-quantum_cryptography
Summum ius, summa iniuria.

skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Re: IBM quantum computing

Postby skeptical scientist » Wed Feb 29, 2012 6:33 am UTC

Thesh wrote:
skeptical scientist wrote:Using known quantum computer algorithms, it only breaks crypto which is based on prime factorization being hard

Well, I wouldn't say "Only". Prime factorization, discrete logarithms, and elliptic curves are all used for public-key cryptography, and are all likely to become breakable with quantum computers sometime in the future.

Yes, those all become breakable using Shor's algorithm, which is at heart an algorithm for integer factorization and the closely related period-finding problem. I stand by what I said. It just so happens that the class of crypto based on problems like prime factorization being hard includes essentially all public-key cryptosystems in wide use, which makes it a rather big deal.
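The reduction skeptical scientist is leaning on can be demonstrated classically: factoring N reduces to finding the multiplicative order r of a random base a mod N, and the only step Shor's algorithm speeds up is finding r (done by brute force below, by quantum period-finding on real hardware):

```python
from math import gcd

# Classical demo of the reduction inside Shor's algorithm: factoring n
# reduces to finding the order r of a base a (smallest r > 0 with
# a**r % n == 1). Only the order-finding step gets a quantum speedup;
# here it is brute-forced, which is the exponentially slow part.
def order(a, n):
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n, a):
    r = order(a, n)
    if r % 2:
        return None                  # odd order: retry with another base
    y = pow(a, r // 2, n)
    f = gcd(y - 1, n)
    return f if 1 < f < n else None  # also fails when y is -1 mod n

print(factor_via_order(15, 7))       # order(7, 15) == 4, yielding factor 3
```

Some random bases fail (odd order, or a trivial gcd), which is why the algorithm is probabilistic and retries with a fresh base.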
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

Ghostbear
Posts: 1764
Joined: Sat Apr 26, 2008 10:06 pm UTC

Re: IBM quantum computing

Postby Ghostbear » Wed Feb 29, 2012 8:15 am UTC

Tirian wrote:Saying that something is five years away from market means that you're entering the last few phases of trials and you hope that all the small problems you've seen stay small.

Saying that something is ten years away from market means that they've got a good base of basic science and a large pile of data and they hope that sometime in the next three years someone will have an AHA moment and that will give you two years to do some successful preliminary trials before you hit that five year benchmark. Of course, insight is unpredictable in individual cases. An enterprising sociology doctoral student might enjoy investigating whether there is some sort of "half-life" of scientific progress that would make this sort of prediction any more than a wild guess.

Saying that something is fifteen years away means that we need one or two AHA moments to get the basic science or the pile of relevant data and that we can't even say that we're ten years away with a straight face. Scientists have claimed that fusion reactors have been fifteen years away from market for the past fifty years.

My question was more along the lines of: is the scientist actually saying 15 years to market? It sounded to me more like he was saying that quantum computing would be feasible in 15 years, not that it would be market-ready. I wouldn't see it as too huge a stretch for the 15 years to mean quantum computing becoming less of an in-lab thing and actually existing in some other places, while still not advanced enough, or too expensive, or otherwise hampered by some limitation, to actually be worth using. Even judging by its typical requirements (superconductivity seems to be common with it) I don't know that we can ever expect quantum computing to replace electrical computing in the mainstream market without some fundamental breakthroughs. More likely, it'll be used for some special tasks at major institutions, with the rest of our computing needs handled by more conventional methods.

maybeagnostic
Posts: 669
Joined: Mon Apr 07, 2008 3:34 pm UTC

Re: IBM quantum computing

Postby maybeagnostic » Wed Feb 29, 2012 2:57 pm UTC

Dauric wrote:I'd guess that we'll get something like '<~', '>~' and '=~' to check for the new "Maybe" state, but I doubt that we'd have to learn '#^%' is the conditional evaluation to test for 'frud'.

As Dauric mentions, quantum computers will be able to do all the standard boolean operations, i.e. they have AND, OR, NOT, NAND and so on*. They also have other kinds of more exotic gates that don't make any sense for a normal computer and require a lot of advanced algebra to understand. A normal bit always has a value of 0 or 1; a qubit's state is, I believe, described by a pair of complex numbers, which can be collapsed to a value between 0 and 1, but many states would map to the same real number. Quantum algorithms always end in a state that uniquely maps to 0 or 1, which allows us to read out the answer. Unlike with normal computers, however, we are completely unaware of the state mid-algorithm, and trying to read it out doesn't give us enough information to reconstruct it AND wipes it out at the same time. That's why we can't just perform 10 operations, read out the state, map it onto new qubits and continue.

* Of course, normal computers would be much faster and cheaper for these normal operations for a very long time.
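maybeagnostic's description maps to very small linear algebra: a qubit is a pair of complex amplitudes, and a gate is a 2x2 matrix applied to them. The classical NOT survives as the X gate, while Hadamard is one of the "exotic" gates with no boolean counterpart. A toy sketch, not a real quantum library:

```python
# A qubit as a pair of amplitudes (alpha, beta) with
# |alpha|**2 + |beta|**2 == 1, and gates as 2x2 matrices. X is ordinary
# boolean NOT; Hadamard (H) has no classical analogue and turns a
# definite 0 into an equal superposition of 0 and 1.
def apply(gate, qubit):
    (a, b), (c, d) = gate
    x, y = qubit
    return (a * x + b * y, c * x + d * y)

X = ((0, 1), (1, 0))                 # quantum NOT
s = 2 ** -0.5
H = ((s, s), (s, -s))                # Hadamard

zero = (1, 0)                        # the classical bit 0
print(apply(X, zero))                # (0, 1): just NOT, i.e. bit 1
print(apply(H, zero))                # amplitude 1/sqrt(2) on each of 0 and 1
```

Applying H twice returns (up to float round-off) to `zero`: the two in-between paths interfere, and that interference is exactly what the exotic gates exploit and what reading out mid-algorithm would destroy.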
T: ... through an emergency induction port.
S: That's a straw, Tali.
T: Emerrrgency induction port.

Dauric
Posts: 3995
Joined: Wed Aug 05, 2009 6:58 pm UTC
Location: In midair, traversing laterally over a container of sharks. No water, just sharks, with lasers.

Re: IBM quantum computing

Postby Dauric » Wed Feb 29, 2012 3:12 pm UTC

maybeagnostic wrote:They also have other kinds of more exotic gates that don't make any sense for a normal computer and require a lot of advanced algebra to understand.


So in the plausible future (assuming I'm collecting our respective evaluations correctly) we're looking at a multi-processor system, one traditional and one quantum. The complexity of the assembly language for the quantum processor (because of the complex mathematics in quantum mechanics) pretty much ensures that any coding language that makes use of the quantum processor will require a mid-to-high-level language that provides significant abstraction barriers between the actual mathematics of quantum mechanics and "output = FunctionToDoUsefulThing(input)".
We're in the traffic-chopper over the XKCD boards where there's been a thread-derailment. A Liquified Godwin spill has evacuated threads in a fourty-post radius of the accident, Lolcats and TVTropes have broken free of their containers. It is believed that the Point has perished.

maybeagnostic
Posts: 669
Joined: Mon Apr 07, 2008 3:34 pm UTC

Re: IBM quantum computing

Postby maybeagnostic » Wed Feb 29, 2012 4:55 pm UTC

Dauric wrote:
maybeagnostic wrote:They also have other kinds of more exotic gates that don't make any sense for a normal computer and require a lot of advanced algebra to understand.


So in the plausible future (assuming I'm collecting our respective evaluations correctly) we're looking at a multi-processor system, one traditional and one quantum. The complexity of the assembly language for the quantum processor (because of the complex mathematics in quantum mechanics) pretty much ensures that any coding language making use of the quantum processor will have to be a mid-to-high-level language that provides significant abstraction barriers between the actual mathematics of quantum mechanics and "output = FunctionToDoUsefulThing(input)".

Yes, if quantum processors become widespread they will likely be an addition to, rather than a replacement for, normal processors in the beginning. I am not sure they will be so widespread, though. While quantum computing can definitely be extremely useful for breaking cryptography and solving NP-complete problems in general, our use of computers right now avoids tackling NP-complete problems because they are prohibitively expensive to compute. I can imagine specialized decryption devices, but most things might be cloud-based; for example, approximating path-finding and travelling-salesman solutions is already mostly done in the cloud. Eventually they might spread as people develop applications specifically for quantum computing, but that is going to take years (decades?) after these kinds of processors are available.

As for the programming language... nowadays you might have a programmer who doesn't understand boolean logic, but that'd be a pretty terrible programmer. Similarly, you need an understanding of quantum logic to be able to design and understand quantum algorithms. The only difference is that boolean logic is 'natural': we encounter it every single day in the world around us. Quantum logic is alien to our everyday lives, and many of its operations have no analogue in the macroscopic world.

So, yes, at first someone will probably write a set of immediately obvious and useful quantum algorithms and put them in a library that can be called from... C++ or Java or whatever. Anyone who wants to develop, rather than just use, quantum computing would still need to learn the basic operations, which is a lot more difficult than with normal computing. What kinds of programming languages might be developed for that? No idea. The programming languages we have right now are good at doing what we use our computers to do, which is what they, in turn, are good at. Quantum computers would be good at different (or just more?) things, so the languages will change accordingly. Initially they would just have to include the quantum operations, but that is assembly-level stuff and it would only be accessible from our current languages in a very restricted way.
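The classic example of an operation with no boolean analogue is the Hadamard gate. Here's a toy sketch in plain Python (just a 2x2 matrix acting on a pair of amplitudes, not any real quantum library): it takes a definite 0 into an equal superposition, and applying it twice gets you back where you started, which no classical gate on a bit can do.

```python
import math

# Hadamard gate as a 2x2 matrix of real amplitudes.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply(gate, state):
    """Apply a 2x2 gate to a single-qubit state vector [amp0, amp1]."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

zero = [1, 0]          # a classical-looking |0>
plus = apply(H, zero)  # equal superposition: no boolean analogue
back = apply(H, plus)  # H is its own inverse: back to |0> (up to rounding)
print(plus, back)
```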
T: ... through an emergency induction port.
S: That's a straw, Tali.
T: Emerrrgency induction port.

jareds
Posts: 436
Joined: Wed Jan 03, 2007 3:56 pm UTC

Re: IBM quantum computing

Postby jareds » Thu Mar 01, 2012 12:12 am UTC

maybeagnostic wrote:While quantum computing can definitely be extremely useful for breaking cryptography and solving NP-complete problems in general our use of computers right now avoids tackling NP-complete problems because they are prohibitively expensive to compute.

"Quantum computers are not known to be able to solve NP-complete problems in polynomial time" -Scott Aaronson

I believe this is in the header of his blog precisely because it is a common misconception.

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Re: IBM quantum computing

Postby skeptical scientist » Fri Mar 02, 2012 8:28 pm UTC

maybeagnostic wrote:Our use of computers right now avoids tackling NP-complete problems because they are prohibitively expensive to compute.

That's not quite true either. In practice, some NP-hard problems are surprisingly tractable.
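For instance (a toy sketch of one standard trick, not a general-purpose solver): subset sum is NP-complete, yet the textbook dynamic-programming approach solves it exactly in time proportional to the target value, so instances with moderate numbers are dispatched instantly.

```python
def subset_sum(nums, target):
    """Exact subset-sum decision via dynamic programming.

    Pseudo-polynomial: runs in roughly O(len(nums) * target) time,
    which is fast whenever the target value is moderate, even though
    the problem is NP-complete in general.
    """
    reachable = {0}
    for n in nums:
        reachable |= {s + n for s in reachable if s + n <= target}
    return target in reachable

# An instance with 67 numbers, solved essentially instantly.
print(subset_sum(list(range(1, 200, 3)), 5000))  # True
print(subset_sum([3, 6], 5))                     # False
```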
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

letterX
Posts: 535
Joined: Fri Feb 22, 2008 4:00 am UTC
Location: Ithaca, NY

Re: IBM quantum computing

Postby letterX » Fri Mar 02, 2012 11:10 pm UTC

skeptical scientist wrote:
maybeagnostic wrote:Our use of computers right now avoids tackling NP-complete problems because they are prohibitively expensive to compute.

That's not quite true either. In practice, some NP-hard problems are surprisingly tractable.

Yup. I'm actually sitting here right this second solving some NP complete problems to optimality (or pretty close, anyways). And while it's taking longer than I'd like, that's mostly because there's rather a lot of them, and I have a paper deadline on Monday that they really need to be done for...

User avatar
Minerva
Posts: 947
Joined: Sat Nov 11, 2006 2:58 pm UTC
Location: Australia
Contact:

Re: IBM quantum computing

Postby Minerva » Sat Mar 03, 2012 11:09 am UTC

A quantum computer only offers algorithms with better computational complexity than any known classical algorithm for certain specific types of computer science problem (Shor's algorithm for factorisation, or Grover's algorithm for search in an unsorted database).

Quantum computers are not known to be able to solve all (or any!) NP-complete problems in polynomial time.

But those kinds of increases in algorithm performance for certain specific computer science tasks such as search in an unsorted list, or large-integer factorisation, do have great practical value, in many cases.
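To put a number on the search speedup (simple arithmetic only, with my own illustrative function names): Grover's algorithm needs on the order of (π/4)·√N oracle queries, versus up to N classically, so a million-item unstructured search drops from a million queries to under eight hundred.

```python
import math

def classical_queries(n):
    """Worst-case oracle queries for classical unstructured search."""
    return n

def grover_queries(n):
    """Approximate oracle queries for Grover's algorithm, ~(pi/4)*sqrt(n)."""
    return math.ceil((math.pi / 4) * math.sqrt(n))

n = 10**6
print(classical_queries(n), grover_queries(n))  # 1000000 vs 786
```

A quadratic speedup rather than an exponential one, but for large search spaces that is still a very big deal.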

It's relatively easy to build quantum computers, but only with a limited number of entangled qubits. We can easily build a QC, enter, say, 15 as the input to Shor's algorithm, and it will quickly complete the algorithm, in polynomial time, and return the prime factors, which are 3 and 5 in this case. (This was famously demonstrated in 2001 using a liquid bulk-ensemble NMR system with 7 qubits - a real, working, bona fide quantum computer.) But you don't have enough qubits in the system to play with when you try to put a 1024-bit RSA key into it, so fortunately/unfortunately it doesn't work.
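For the curious, only part of Shor's algorithm is quantum: the hard step is finding the multiplicative order r of a base a mod N, and the rest is classical number theory. Here's a sketch of that classical post-processing on N = 15, with brute force standing in for the quantum period-finding step (function names are my own, and real implementations handle more edge cases):

```python
import math

def order(a, n):
    """Multiplicative order of a mod n. This is the step Shor's algorithm
    does efficiently on quantum hardware; brute force stands in here."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Recover factors of n from the order of a mod n, when a is suitable."""
    r = order(a, n)
    if r % 2:
        return None          # odd order: retry with a different base a
    y = pow(a, r // 2, n)    # a^(r/2) mod n
    if y == n - 1:
        return None          # trivial square root: retry with another a
    return math.gcd(y - 1, n), math.gcd(y + 1, n)

print(shor_classical_part(15, 2))  # (3, 5)
```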

The developments in QC research today have nothing to do with building a quantum computer; that was first done years ago. It's all about building scalable quantum computing devices, with a scalable number of hardware qubits working together, before factors like decoherence make the device impractical.

And, of course, even if you have a quantum computer, much to the dismay of journalists, quantum computing does not offer "billion times faster computer!!one".
...suffer from the computer disease that anybody who works with computers now knows about. It's a very serious disease and it interferes completely with the work. The trouble with computers is you play with them. They are so wonderful. - Richard Feynman

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Re: IBM quantum computing

Postby skeptical scientist » Sun Mar 04, 2012 6:34 am UTC

Yeah, I'm not quite sure why people are so eager for quantum computers. It's not like they're dramatically faster in general, just that they are good at solving certain types of hard problems. And the hard problems that they are good at solving are mostly problems we want to be hard, not problems that we wish were easy.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

Vanzetti
Posts: 64
Joined: Tue Nov 18, 2008 7:31 pm UTC

Re: IBM quantum computing

Postby Vanzetti » Sun Mar 04, 2012 11:49 am UTC

skeptical scientist wrote:Yeah, I'm not quite sure why people are so eager for quantum computers. It's not like they're dramatically faster in general, just that they are good at solving certain types of hard problems. And the hard problems that they are good at solving are mostly problems we want to be hard, not problems that we wish were easy.


Biologist here. Protein folding, that's why. We want it to be easy. It is bloody hard.

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Re: IBM quantum computing

Postby skeptical scientist » Sun Mar 04, 2012 12:57 pm UTC

Vanzetti wrote:Biologist here. Protein folding, that's why. We want it to be easy. It is bloody hard.

Yeah, but isn't that an example of a problem that might still be hard even with quantum computers, for all we know?
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

Vanzetti
Posts: 64
Joined: Tue Nov 18, 2008 7:31 pm UTC

Re: IBM quantum computing

Postby Vanzetti » Sun Mar 04, 2012 3:56 pm UTC

skeptical scientist wrote:
Vanzetti wrote:Biologist here. Protein folding, that's why. We want it to be easy. It is bloody hard.

Yeah, but isn't that an example of a problem that might still be hard even with quantum computers, for all we know?


Protein simulation is not my area of expertise, but I read in a several places that quantum computers will make it easier. I hope it's not just hype. Efficient and fast protein simulation will make life so much better. Just imagine designing a drug molecule and testing it in silico against all human proteins.

letterX
Posts: 535
Joined: Fri Feb 22, 2008 4:00 am UTC
Location: Ithaca, NY

Re: IBM quantum computing

Postby letterX » Sun Mar 04, 2012 5:39 pm UTC

skeptical scientist wrote:
Vanzetti wrote:Biologist here. Protein folding, that's why. We want it to be easy. It is bloody hard.

Yeah, but isn't that an example of a problem that might still be hard even with quantum computers, for all we know?

Actually, simulating quantum systems is one of the problems known to be significantly faster on quantum computers. I don't know that that applies immediately to protein folding, since I don't think quantum effects dominate in proteins (hydrophilic/hydrophobic effects are more important), but in general, simulating molecules and their behaviour is likely to be much more tractable with quantum computing.

