Moore's law

A place to discuss the science of computers and programs, from algorithms to computability.

Formal proofs preferred.

Moderators: phlip, Moderators General, Prelates

Shadowfish
Posts: 309
Joined: Tue Feb 27, 2007 2:27 am UTC

Moore's law

Postby Shadowfish » Fri Mar 16, 2007 5:06 pm UTC

How long do you think it will continue? Will we reach physical limits, or will we be able to pass them?

User avatar
LE4dGOLEM
is unique......wait, no!!!!
Posts: 5972
Joined: Thu Oct 12, 2006 7:10 pm UTC
Location: :uoıʇɐɔol

Postby LE4dGOLEM » Fri Mar 16, 2007 5:13 pm UTC

Moore's law is the hardware-capacity-doubles-every-eighteen-months one, right? If so, I think it'll begin to slow down after a while, but no, I don't think we will "hit a limit" of potential computer power. Software, however, perhaps.
Une See Fights - crayon super-ish hero webcomic!
doogly wrote:It would just be much better if it were not shitty.

User avatar
Peshmerga
Mad Hatter
Posts: 2061
Joined: Wed Oct 04, 2006 1:56 am UTC
Contact:

Postby Peshmerga » Fri Mar 16, 2007 6:45 pm UTC

The man has already stated that his law will no longer be applicable beyond a certain date. I forget exactly when, but unless a newer technology becomes readily available, transistor size will reach a hard limit.

As for technology in and of itself, I would predict that it will keep improving at a steadily increasing rate, assuming no world disaster fucks us three ways from Tuesday.
i hurd u liek mudkips???

User avatar
hyperion
"I'll show ye...."
Posts: 1569
Joined: Wed Nov 29, 2006 2:16 pm UTC
Location: Perth

Postby hyperion » Sat Mar 17, 2007 1:57 am UTC

once (if?) we reach the one-electron-transistor, the law will stop applying since we really couldn't get much smaller, so a specific density limit will be reached.
Peshmerga wrote:A blow job would probably get you a LOT of cheeseburgers.
But I digress.

User avatar
Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 17, 2007 2:46 am UTC

If you track technological information-processing density and cost from before the invention of the transistor to today...

It follows Moore's law. All the way back to the Jacquard loom, the difference engine, IBM counting machines, vacuum-tube computation, transistors, integrated circuits.

All the way to today.

This implies there is something greater than the properties of integrated circuit development going on.

Recently, Moore's law has been broken -- tech has been growing faster than it predicts. I've seen some reasonable explanations for this (from more of the economy being thrown at the problem, to feedback effects from using computer tech to improve computer tech), but it is interesting that Moore's law is now more a lower bound than an approximation.

User avatar
LE4dGOLEM
is unique......wait, no!!!!
Posts: 5972
Joined: Thu Oct 12, 2006 7:10 pm UTC
Location: :uoıʇɐɔol

Postby LE4dGOLEM » Sat Mar 17, 2007 11:01 am UTC

HYPERiON wrote:once (if?) we reach the one-electron-transistor, the law will stop applying since we really couldn't get much smaller, so a specific density limit will be reached.


What about one-photon transistors, or things better than transistors? Or even, um, implied electron transistors (I don't know)?

demon
Posts: 170
Joined: Tue Feb 06, 2007 8:13 pm UTC

Postby demon » Sat Mar 17, 2007 6:01 pm UTC

correct me if i'm wrong, but once we learn to use more quantum states than just 0 and 1, wouldn't we be able to cram more data in one place, getting even >1 bit/electron in total in a memory bank, thus allowing the law to hold and breaking that hypothetical density limit?

User avatar
LE4dGOLEM
is unique......wait, no!!!!
Posts: 5972
Joined: Thu Oct 12, 2006 7:10 pm UTC
Location: :uoıʇɐɔol

Postby LE4dGOLEM » Sat Mar 17, 2007 6:02 pm UTC

demon wrote:correct me if i'm wrong, but once we learn to use more quantum states than just 0 and 1, wouldn't we be able to cram more data in one place, getting even >1 bit/electron in total in a memory bank, thus allowing the law to hold and breaking that hypothetical density limit?


You'd have to teach the computer what this not-zero not-one state is though.

demon
Posts: 170
Joined: Tue Feb 06, 2007 8:13 pm UTC

Postby demon » Sat Mar 17, 2007 8:28 pm UTC

and isn't that precisely what quantum computing is?

User avatar
OmenPigeon
Peddler of Gossamer Lies
Posts: 673
Joined: Mon Sep 25, 2006 6:08 am UTC
Contact:

Postby OmenPigeon » Sat Mar 17, 2007 9:49 pm UTC

demon wrote:and isn't that precisely what quantum computing is?


Not... really? 'Quantum computing' as most people mean it is, basically, a probabilistic Turing machine. To simplify things somewhat, where a regular Turing machine has exactly one thing it can do for a particular combination of {tape contents, head location, state}, a probabilistic one can have a number of choices for head actions and state transitions. Moreover, it can, in some sense, choose them all at the same time. Exploiting this allows for complexity gains in some traditionally intractable problems, like factoring large numbers.

As far as I know, no one has proposed a method for data storage which takes advantage of quantum states.

Tangentially, there are also a few people who propose using quantum effects to solve problems which Turing machines cannot solve (PDF). Whether or not such a machine would work is a matter of some disagreement. It certainly hasn't been built.
As long as I am alive and well I will continue to feel strongly about prose style, to love the surface of the earth, and to take pleasure in scraps of useless information.
~ George Orwell

demon
Posts: 170
Joined: Tue Feb 06, 2007 8:13 pm UTC

Postby demon » Sun Mar 18, 2007 10:34 am UTC

right, that's what you get for trusting your parents, i knew i should've checked it on wiki first:D

User avatar
evilbeanfiend
Posts: 2650
Joined: Tue Mar 13, 2007 7:05 am UTC
Location: the old world

Postby evilbeanfiend » Mon Mar 19, 2007 9:44 am UTC

for traditional transistors in silicon chips we may not reach the physical limit, as there is also Moore's second law: every time the number of transistors on a chip doubles, the fabrication cost also doubles. crucially, though, it looks like Moore's second law may not apply to other technologies

User avatar
Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Mon Mar 19, 2007 4:40 pm UTC

QC is not an NDTM -- or at least, nobody has managed to prove that QC is powerful enough to solve all NDTM problems, nor has anyone produced a QC algorithm that solves an NP-complete problem in P time.

The operations allowed in QC are strange and unlike what we are used to.

There is, I believe, a proof that QC-PTIME is in TM-PSPACE (a flatmate of mine did a paper fixing up part of that proof -- the existing proof seemed a bit waffly on whether floating-point approximation errors might make the simulation fail to match the real QC result), but I'm not absolutely certain.

User avatar
Bluesprite
Posts: 47
Joined: Tue Mar 27, 2007 2:41 am UTC

Postby Bluesprite » Tue Mar 27, 2007 2:44 am UTC

Question re: quantum computing.

I've done only a limited amount of research into the theory of computation, and wikipedia has been insufficient, so I thought maybe someone here might be able to answer.

Is it the case that NP-Complete problems will be tractable under quantum computers?

You said that a quantum machine is a probabilistic Turing machine, which I assume is distinct from both deterministic and non-deterministic Turing machines?

User avatar
OmenPigeon
Peddler of Gossamer Lies
Posts: 673
Joined: Mon Sep 25, 2006 6:08 am UTC
Contact:

Postby OmenPigeon » Tue Mar 27, 2007 3:33 am UTC

Bluesprite wrote:wikipedia has been insufficient


Correction: Wikipedia knows EVERYTHING.

Wiki

The picture there does a pretty good job, I think.

The general idea is that we don't know for sure, but we suspect that some problems in NP will be solvable by quantum computers, but NP-complete problems won't be.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Tue Mar 27, 2007 5:00 am UTC

Now, see, this is all very weird, because I've been under the impression that "if we can get quantum computers to work, we can factor large integers quickly" and break RSA, etc.

Is this just flat out wrong? It certainly seems to be...

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Tue Mar 27, 2007 5:06 am UTC

Bluesprite wrote:Question re: quantum computing.

I've done only a limited amount of research into the theory of computation, and wikipedia has been insufficient, so I thought maybe someone here might be able to answer.

Is it the case that NP-Complete problems will be tractable under quantum computers?

You said that a quantum machine is a probabilistic Turing machine, which I assume is distinct from both deterministic and non-deterministic Turing machines?

Well, if P=NP, then the answer to this is obviously yes (since in that case they're tractable on classical computers), but there is no known algorithm to solve NP-complete problems in quantum polynomial time. So the answer is that we don't know.

There is a known quantum algorithm for finding prime factorizations in polytime (finding prime factorizations, of course, is in NP but is probably not NP-complete), as you are probably already aware. But if you throw away the structure of NP-complete problems and treat them as brute-force searches with a polytime verification algorithm, all you get from the known quantum algorithms is a square-root speed-up; i.e. a problem that would take O(2^n) time to brute-force search with a classical algorithm takes O(2^(n/2)) time with a quantum algorithm. So if brute-force searching is the best that can be done, NP-complete problems still can't be solved in polynomial time on quantum computers. Of course, cleverer methods may still be found.
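That square-root speed-up is easy to put numbers on (a rough sketch; `classical_steps` and `quantum_steps` are just illustrative names for the two asymptotic costs):

```python
# Rough cost comparison for brute-force search over n-bit keys:
# a classical search checks on the order of 2^n candidates, while
# Grover-style quantum search needs on the order of 2^(n/2) iterations.
def classical_steps(n):
    return 2 ** n

def quantum_steps(n):
    # assumes even n for a clean integer exponent
    return 2 ** (n // 2)

# A quantum search over 40-bit keys costs about what a classical search
# over 20-bit keys does -- a real gain, but nowhere near polynomial.
print(classical_steps(20))  # 1048576
print(quantum_steps(40))    # 1048576
```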
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Tue Mar 27, 2007 6:28 am UTC

skeptical scientist wrote:There is a known quantum algorithm for finding prime factorizations in polytime (finding prime factorizations, of course, is in NP but is probably not NP-complete), as you are probably already aware.


Ahhhh, that explains where my misconception lies. I thought that factorization was NP-complete for some reason.

User avatar
RealGrouchy
Nobody Misses Me As Much As Meaux.
Posts: 6704
Joined: Thu May 18, 2006 7:17 am UTC
Location: Ottawa, Ontario, Canada
Contact:

Postby RealGrouchy » Tue Mar 27, 2007 11:33 pm UTC

It is quite likely that some day there will not be sufficient economic and/or scientific interest to continue research into higher density processors. However, I doubt that such a time is within the foreseeable future.

- RG>
Jack Saladin wrote:etc., lock'd
Mighty Jalapeno wrote:At least he has the decency to REMOVE THE GAP BETWEEN HIS QUOTES....
Sungura wrote:I don't really miss him. At all. He was pretty grouchy.

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Wed Mar 28, 2007 8:45 am UTC

RealGrouchy wrote:It is quite likely that some day there will not be sufficient economic and/or scientific interest to continue research into higher density processors. However, I doubt that such a time is within the foreseeable future.

- RG>

Unless someone comes up with unbreakable cryptography (e.g. quantum cryptography), there will always be a war between those wanting to hide information and those wanting to uncover it, and raw processing power will always be an important tool in that war. Similarly, there are problems that can be solved by computers given enough time. For example, given enough time, a computer can prove any provable statement in any formal language - if you have enough processing power, computers are viable theorem-proving machines. Or running simulations - the more processing power you have, the more accurate the simulations you can run in a reasonable amount of time. There is no reason (short of hypercomputation, which is probably impossible) to think that people would ever lose interest in increased processing power.
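The theorem-proving point can be made concrete with the bluntest possible prover, a truth-table search (a toy sketch; `is_tautology` and the formula encoding are illustrative choices, not a real proof assistant):

```python
from itertools import product

def implies(a, b):
    """Material implication: a -> b."""
    return (not a) or b

def is_tautology(formula, num_vars):
    # Brute-force check of all 2^num_vars truth assignments --
    # guaranteed to finish, and raw processing power is all it costs.
    return all(formula(*bits) for bits in product([False, True], repeat=num_vars))

# Peirce's law ((p -> q) -> p) -> p holds under every assignment:
peirce = lambda p, q: implies(implies(implies(p, q), p), p)
print(is_tautology(peirce, 2))                      # True
print(is_tautology(lambda p, q: implies(p, q), 2))  # False: fails at p=True, q=False
```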

User avatar
adlaiff6
Posts: 274
Joined: Fri Nov 10, 2006 6:08 am UTC
Location: Wouldn't you rather know how fast I'm going?
Contact:

Postby adlaiff6 » Wed Mar 28, 2007 1:47 pm UTC

In any case, back to Moore's law (for my post at least)...

We have transistors down to about 20 atoms wide. At some point, there is going to be a limit to how small they can get. "The" law, as I remember, refers to the doubling of the number of transistors on a chip of a given size. Therefore, yes, it will end. Processing power, however (ignoring non-deterministic computers for a moment), will continue to increase with better understanding of parallel processing. We are about to hit a point in time where, to do anything meaningful, one will need to know a lot about parallel processing and be able to write efficient code for it. According to rumor, parallel code is extremely hard to write, so good luck, guys.

User avatar
Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Wed Mar 28, 2007 2:45 pm UTC

If Moore's law continues, the foreseeable future extends to about 20 to 40 years.

Beyond that, computers will be significantly smarter than humans (i.e., capable of physically simulating 10 billion human brains faster than real time).

(This is based on the complexity of simulating smaller parts of the brain to a degree of accuracy sufficient to replicate functional equivalence, plus Moore's law.)

User avatar
gmalivuk
GNU Terry Pratchett
Posts: 26836
Joined: Wed Feb 28, 2007 6:02 pm UTC
Location: Here and There
Contact:

Postby gmalivuk » Wed Mar 28, 2007 5:13 pm UTC

skeptical scientist wrote:Unless someone comes up with unbreakable cryptography (i.e. quantum cryptography) there will always be a war between those wanting to hide information, and those wanting to uncover it, and raw processing power will always be an important tool in that war.


It was my understanding that the technology already exists for quantum encryption via (relatively short) fiber optics lines. The current cost is prohibitive for widespread use, but it wouldn't surprise me in the least if, say, the White House and Pentagon already exchange one-time keys this way.

Another barrier is that an eavesdropper tapping the bitstream, though unable to actually get the accurate message, does garble it in a random way by eavesdropping, thus preventing the accurate transmission of any data.
Unless stated otherwise, I do not care whether a statement, by itself, constitutes a persuasive political argument. I care whether it's true.
---
If this post has math that doesn't work for you, use TeX the World for Firefox or Chrome

(he/him/his)

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Wed Mar 28, 2007 8:53 pm UTC

gmalivuk wrote:It was my understanding that the technology already exists for quantum encryption via (relatively short) fiber optics lines. The current cost is prohibitive for widespread use, but it wouldn't surprise me in the least if, say, the White House and Pentagon already exchange one-time keys this way.

So it does. I was misremembering what I had read.

gmalivuk wrote:Another barrier is that an eavesdropper tapping the bitstream, though unable to actually get the accurate message, does garble it in a random way by eavesdropping, thus preventing the accurate transmission of any data.

There may be schemes to get around this, but it's at least true for the incarnation of quantum crypto that I've seen. In any case, it means that for all practical purposes, RSA is still better than quantum crypto, so processing power is still king in cryptography: even with quantum computers, while processing power is no longer the obstacle to cracking RSA in reasonable time, it would still be needed to solve NP-complete problems in reasonable time with the best algorithms presently known.

User avatar
Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Wed Mar 28, 2007 9:37 pm UTC

Eavesdropping on the bitstream is harder than just cutting the wire.

RSA is also stopped when you cut wires.

The disadvantage of QM crypto is that your wire has to be really special, while under RSA-like PK crypto, your wire can be any information conduit.

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Thu Mar 29, 2007 11:14 am UTC

Yakk wrote:Eavesdropping on the bitstream is harder than just cutting the wire.

RSA is also stopped when you cut wires.

The disadvantage of QM crypto is that your wire has to be really special, while under RSA-like PK crypto, your wire can be any information conduit.

Right, which is why cutting the wires is actually possible when it comes to quantum crypto, but not for RSA - if I'm in London and you're in New York, nothing can stop us from communicating securely via RSA other than 1) some way of breaking RSA or 2) some way of cutting off internet access to New York or London. I think we can generally accept that 2 is impossible, so the only way of stopping people from communicating securely via RSA is to break it.

Part of what makes RSA so powerful is that there is no need for private communication to ever take place in order to obtain secure communication. Quantum crypto, on the other hand, does need private communication to take place - the communication that produces the one-time pad. The advantage of quantum is that the only information that needs to be transmitted privately is, in and of itself, random useless information, and the parties can automatically detect whether there was an eavesdropper. So it's much easier to stop people from communicating via quantum crypto than via RSA, but theoretically impossible to eavesdrop on quantum crypto, while only practically impossible to eavesdrop on RSA communication.

"Practically impossible" means that you either need to spend an absurdly long time factoring a large semiprime (or get absurdly lucky when guessing the prime factors - and the semiprime can be made large enough that it generally won't matter if the code is eventually broken), you need to discover an efficient way of factoring numbers on a classical computer (which probably doesn't exist), or you need to build a quantum computer (which is definitely within the realm of possibility, and could already be in the possession of some organizations, but is not something that your average corporate spy, criminal, or even third-world government will have access to).
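To make the "practically impossible" part concrete, here is a toy RSA round trip (the primes 61 and 53 are textbook-tiny values chosen for illustration; real keys use primes hundreds of digits long, which is exactly what makes recovering p and q from n infeasible classically):

```python
# Toy RSA with tiny illustrative primes. An attacker who factors n back
# into p and q can recompute d and read everything -- hence the interest
# in quantum factoring.
p, q = 61, 53
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)      # secret once p and q are discarded
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent via modular inverse (Python 3.8+)

message = 42
cipher = pow(message, e, n)            # anyone with (n, e) can encrypt
assert pow(cipher, d, n) == message    # only the holder of d can decrypt
```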

mgoldb2
Posts: 8
Joined: Mon Apr 02, 2007 1:14 pm UTC

Postby mgoldb2 » Mon Apr 02, 2007 2:53 pm UTC

OmenPigeon wrote:As far as I know, no one has proposed a method for data storage which takes advantage of quantum states.


Actually, from what I understand, optical-based quantum computers do take advantage of quantum states.

I have some academic articles on the subject but don't feel like digging them up at the moment, so here's a quick article on the subject I googled.

http://www.physorg.com/news11087.html

User avatar
entropomorphic
Posts: 23
Joined: Mon Apr 02, 2007 2:51 pm UTC

Postby entropomorphic » Mon Apr 02, 2007 4:06 pm UTC

There are lots of ways we can continue increasing computing power density even after transistor density has maxed out...

While quantum computing probably won't be suitable for general-purpose use in the near future, mainstream computers might adopt quantum coprocessors to attack certain very hard problems far faster.

In very simplified terms, microprocessors are made up of logic gates, and logic gates are made up of transistors. As far as I understand, most fabrication methods use repetitive forms throughout the processor, to minimize manufacturing cost. It's quite possible that logic gates, ALUs, registers, and other processor components larger than a transistor could be developed that use less space, even when we can't shrink individual transistors any smaller.

There is a very real overhead in terms of chip complexity to support large and complex instruction sets like we use today. Even though the "RISC revolution" fizzled in the face of Intel and AMD, it's quite possible that optimizations on the instruction set level would enable more efficient use of die real estate and at least enable a higher degree of parallelism. In the same vein, software compilers could use the extended instructions more effectively, and even programming languages might be improved to make it easier to take advantage of SIMD hardware boosts like MMX.

Anyway, my point is that computers will probably continue to get faster in terms of measurable computational power, via numerous other technological innovations, even if the single measure of transistor density can't be improved indefinitely.

themuffinking
Posts: 52
Joined: Sat Mar 10, 2007 4:42 am UTC

Postby themuffinking » Tue Apr 03, 2007 1:54 am UTC

Well, eventually, someone will come up with a more modular computer - just keep adding processors for more speed. I hope. That'd be freaking awesome - as long as it doesn't become pokemon. Dear God, I don't want to have to collect and trade processors.

starkruzr
Posts: 45
Joined: Mon Dec 18, 2006 7:35 am UTC
Contact:

Postby starkruzr » Sun Apr 08, 2007 1:39 am UTC

There is a lot of research going on right now (mine included) that is concerned with getting past the fundamental CMOS device size/power limit. Anyone else here ever heard of QCA (quantum-dot cellular automata)?
+++ATH0

User avatar
Strilanc
Posts: 646
Joined: Fri Dec 08, 2006 7:18 am UTC

Postby Strilanc » Thu Apr 19, 2007 7:57 am UTC

themuffinking wrote:Well, eventually, someone will come up with a more modular computer - just keep adding processors for more speed. I hope. That'd be freaking awesome - as long as it doesn't become pokemon. Dear God, I don't want to have to collect and trade processors.


You can't just keep adding CPUs: they need to interact somehow. The more you add, the more overhead is involved in the interaction.
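Amdahl's law is the standard way to quantify that overhead point: if only a fraction p of the work parallelizes, k processors can never deliver more than a 1/(1-p) speedup (a minimal sketch; `amdahl_speedup` is an illustrative name):

```python
def amdahl_speedup(p, k):
    """Speedup of a program whose parallelizable fraction is p, on k processors."""
    return 1.0 / ((1.0 - p) + p / k)

# Even 95%-parallel code tops out below 20x, no matter how many CPUs you add:
print(amdahl_speedup(0.95, 1024))   # ~19.6
print(amdahl_speedup(0.95, 10**6))  # ~20.0 -- approaching the 1/(1-p) ceiling
```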
Don't pay attention to this signature, it's contradictory.

User avatar
LE4dGOLEM
is unique......wait, no!!!!
Posts: 5972
Joined: Thu Oct 12, 2006 7:10 pm UTC
Location: :uoıʇɐɔol

Postby LE4dGOLEM » Thu Apr 19, 2007 6:56 pm UTC

Alky wrote:
themuffinking wrote:Well, eventually, someone will come up with a more modular computer - just keep adding processors for more speed. I hope. That'd be freaking awesome - as long as it doesn't become pokemon. Dear God, I don't want to have to collect and trade processors.


You can't just keep adding CPUs: they need to interact somehow. The more you add, the more overhead is involved in the interaction.


CPUs to deal with the overhead!

User avatar
skeptical scientist
closed-minded spiritualist
Posts: 6142
Joined: Tue Nov 28, 2006 6:09 am UTC
Location: San Francisco

Postby skeptical scientist » Thu Apr 19, 2007 9:58 pm UTC

Assuming Moore's law continues indefinitely, how long would it take before we would have to build a Matrioshka brain?

Also, if computing power goes as O(2^t), and NP-hard problems can be solved in O(2^n) computing time, doesn't that mean NP-hard problems can be solved in O(n) time, and hence P=NP?

Rysto
Posts: 1460
Joined: Wed Mar 21, 2007 4:07 am UTC

Postby Rysto » Thu Apr 19, 2007 10:36 pm UTC

skeptical scientist wrote:Also, if computing power goes as O(2^t), and NP-hard problems can be solved in O(2^n) computing time, doesn't that mean NP-hard problems can be solved in O(n) time, and hence P=NP?

No. The P=?NP problem has to be solved on a particular model of computation, not a continuously improving model of computation.

User avatar
FiddleMath
Posts: 245
Joined: Wed Oct 11, 2006 7:46 am UTC
Location: Madison, WI
Contact:

Postby FiddleMath » Fri Apr 20, 2007 6:53 am UTC

Rysto wrote:No. The P=?NP problem has to be solved on a particular model of computation, not a continuously improving model of computation.


Right - when we increase the speed of a machine, we want to be able to solve, in the same amount of time, rather bigger instances of problems than we could previously. If NP-complete problems cannot be solved with algorithms better than trial-and-error, then doubling the speed of your computer means you can increase the size of a problem instance by 1.
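A quick sketch of that "size + 1" arithmetic (assuming the cost is exactly 2^n steps; `max_brute_force_size` is just an illustrative helper):

```python
def max_brute_force_size(budget_steps):
    # Largest n with 2**n <= budget_steps, i.e. the biggest instance a
    # pure 2^n brute-force search can finish within the step budget.
    return budget_steps.bit_length() - 1

# Doubling the machine's speed (= doubling the step budget) buys size +1:
print(max_brute_force_size(10**12))      # 39
print(max_brute_force_size(2 * 10**12))  # 40
```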

You think the rate of technology is fast now? Wait until someone shows that P = NP. I mean, it won't actually happen. But if it did... hoo boy. (Unless, of course, their proof is totally nonconstructive. That would kick a different set of anthills...)

User avatar
Yakk
Poster with most posts but no title.
Posts: 11129
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Fri Apr 20, 2007 2:21 pm UTC

skeptical scientist wrote:Assuming moore's law continues indefinitely, how long would it take before we would have to build a Matrioshka brain?


Unit of computation: CPS: one Change in a bit Per Second.

Currently, we have computers that run over 10^16 CPS.

1 kg of matter runs at about 10^50 CPS. (rough volume of calculations required to "simulate" itself)

The mass of the solar system, outside of the sun, is mostly in Jupiter. Assume we disassemble Jupiter and turn it into a computation engine.

10^27 kg * 10^50 CPS/kg = 10^77 CPS.

10^77 CPS/10^16 CPS = 10^61 increase in computational power.

At a doubling of computational power every 2 years, it takes 20 years to consume a 10^3 increase. So it would take about 400 years, at current Moore's-law rates, to saturate the mass of Jupiter with perfectly efficient computational structures.
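The 400-year figure checks out if you redo the arithmetic (a sketch using the same rough constants from this post: 10^16 CPS today, 10^50 CPS per kg, Jupiter at about 10^27 kg):

```python
import math

current_cps = 1e16       # roughly today's fastest machines
cps_per_kg = 1e50        # rough physical computation limit per kilogram
jupiter_kg = 1e27

target_cps = jupiter_kg * cps_per_kg           # 1e77 CPS
doublings = math.log2(target_cps / current_cps)
years = 2 * doublings                          # one doubling every 2 years
print(round(years))  # 405, i.e. "about 400 years"
```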

At this point one would start on the sun, which is 10^30 kg. By Moore's law, it would only take about 20 more years to go from Jupiter to consuming the entire sun.

It starts getting harder to find more matter to turn into computational tools. But by that point, the solar-mind is smart, and trying to guess what kind of technology or science it has available would be difficult.

And heck, that is 42 decades off.

Note that current human civilization brains are running at an aggregate of around 10^29 CPS.

Workaphobia
Posts: 121
Joined: Thu Jan 25, 2007 12:21 am UTC

Postby Workaphobia » Mon May 07, 2007 8:30 am UTC

FiddleMath wrote:
Rysto wrote:No. The P=?NP problem has to be solved on a particular model of computation, not a continuously improving model of computation.


Right - when we increase the speed of a machine, we want to be able to solve, in the same amount of time, rather bigger instances of problems than we could previously. If NP-complete problems cannot be solved with algorithms better than trial-and-error, then doubling the speed of your computer means you can increase the size of a problem instance by 1.


Not necessarily by 1, but a constant. Anyway, I think what he was getting at was that if your problem size increases linearly with start time, then the time it takes to solve the problem is also linear. So I could start a computation five years from now of size N, and another one ten years from now of size 2N, and so on, and their running times will all be proportional. Check my math on this, I might've done something stupid.

User avatar
adlaiff6
Posts: 274
Joined: Fri Nov 10, 2006 6:08 am UTC
Location: Wouldn't you rather know how fast I'm going?
Contact:

Postby adlaiff6 » Tue May 08, 2007 6:25 am UTC

skeptical scientist wrote:Also, if computing power goes as O(2^t), and NP hard problems can be solved in O(2^n) computing time, doesn't that mean NP hard problems can be solved in O(n) time, and hence P=NP?


Well, every 18 months, we'll be able to solve a TSP problem with one extra node....

User avatar
blob
Posts: 350
Joined: Thu Apr 05, 2007 8:19 pm UTC

Postby blob » Sat May 19, 2007 6:20 pm UTC

skeptical scientist wrote:Assuming moore's law continues indefinitely, how long would it take before we would have to build a Matrioshka brain?

The aliens in Independence Day would have thrashed us with one of those.

adlaiff6 wrote:Well, every 18 months, we'll be able to solve a TSP problem with one extra node....

So, if you take the whole of human civilization as a model of computation, it is sort of (not provably, unless Moore's law is provable) P ... but with a horribly big constant factor?
Avatar yoinked from Inverloch.

"Unless ... unless they kill us, then animate our corpses as dead zombies to fight for them. Then I suppose they've taken our lives, AND our freedom."
- Elan, OOTS 421.

User avatar
adlaiff6
Posts: 274
Joined: Fri Nov 10, 2006 6:08 am UTC
Location: Wouldn't you rather know how fast I'm going?
Contact:

Postby adlaiff6 » Sun May 20, 2007 8:16 am UTC

blob wrote:So, if you take the whole of human civilization as a model of computation, it is sort of (not provably, unless Moore's law is provable) P ... but with a horribly big constant factor?

Please explain yourself. It seems like you're trying to say something silly, yet profound.
3.14159265... wrote:What about quantization? we DO live in a integer world?

crp wrote:oh, i thought you meant the entire funtion was f(n) = (-1)^n
i's like girls u crazy

