Math


Shadowfish
Posts: 309
Joined: Tue Feb 27, 2007 2:27 am UTC

Math

Postby Shadowfish » Fri Mar 09, 2007 4:17 pm UTC

You may call me dumb, but I'm willing to risk that. Basically, what I am wondering is this: Is math real? For example, in the Dark Ages, did the Arabs discover algebra, or did they invent it? Or, an alternative way to phrase the question: if we met very smart aliens, would we recognize the math that they use?

My instinct is to say that we would, because so many cultures on Earth independently invented/discovered the same math. For example, Pascal's triangle was invented/discovered independently in both Europe and China.

But this scares me a little bit. I am sure that if we met smart aliens, they would be familiar with Newton's laws, relativity, and Maxwell's equations, because these are all part of the physical universe. But why would we expect smart aliens to know calculus or algebra or geometry?

User avatar
Belial
A terrible sound heard from a distance
Posts: 30448
Joined: Sat Apr 15, 2006 4:04 am UTC
Contact:

Postby Belial » Fri Mar 09, 2007 4:24 pm UTC

Or, an alternative way to phrase the question: if we met very smart aliens, would we recognize the math that they use?


I'll leave it to the math geeks to explain *why*, because I'm bad at putting it into words, but the answer here is "pretty much".

Of course, we describe math in words and symbols, so we'd have to figure out the words and symbols they use and how they correspond to ours, but the concepts and relations would be the same.

Also, there would be the little issue of the aliens *probably* not using base-10 notation like we do (I still remember having to do base-12 and base-8 arithmetic in the "Rama" videogame...eugh), but that's neither here nor there.
addams wrote:A drunk neighbor is better than a sober Belial.


They/them

User avatar
3.14159265...
Irrational (?)
Posts: 2413
Joined: Thu Jan 18, 2007 12:05 am UTC
Location: Ajax, Canada

Postby 3.14159265... » Fri Mar 09, 2007 6:02 pm UTC

How would you arrive at Maxwell's equations and relativity, or even at something easier like Newton's laws? Take universal gravitation: how would you prove that you can treat a spherical object (the Earth) as pulling from its center point on the center point of another spherical object (the Moon) without vector calculus?

I don't think you can.
My high school physics teacher once said there are 3 symbols in math (the plus sign, the equals sign, the limit). Everything else can be written as combinations of those. But you need THOSE!

So aliens can have WACKY math. Like they only deal in modulo 31, since that's all they need on their planet because there are 2 tribes with 31 people in each, the people are symmetric, etc. (you can come up with the situation). Would we be able to understand it? YES.

Strictly speaking, what do you define as math?
"The best times in life are the ones when you can genuinely add a "Bwa" to your "ha""- Chris Hastings

User avatar
Nomic
Posts: 554
Joined: Sun Oct 29, 2006 12:29 pm UTC
Location: Gibbering in the corner

Postby Nomic » Fri Mar 09, 2007 6:37 pm UTC

Laws of psychics and such, as well as several mathematical formulas (pi is always 3.14 etc....), are the same everywhere in the universe, but of course aliens would probably have their own system for marking them. But would that really be the same thing, then?

User avatar
Belial
A terrible sound heard from a distance
Posts: 30448
Joined: Sat Apr 15, 2006 4:04 am UTC
Contact:

Postby Belial » Fri Mar 09, 2007 6:42 pm UTC

Yes. Just like Egyptian math in hieroglyphics was still math.

Or Mayan, for that matter.
addams wrote:A drunk neighbor is better than a sober Belial.


They/them

User avatar
Yakk
Poster with most posts but no title.
Posts: 11077
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Fri Mar 09, 2007 6:47 pm UTC

Math is an attempt by people to make themselves smarter.

To see what I mean, let's start with something obvious.

There is a black stone, and a white stone. If I pick up the black stone, and you pick up one of the two stones afterwards, what colour stone do you pick up?

You can use math to solve this, or you can just say "duh, the white stone".

The math proof, done properly, would require a boatload of notation. Using such a math proof would be considered silly to most people, and even most mathematicians.

What math lets us do is solve really really hard problems that are much like the above. It is a bunch of tried and tested rules that describe the pattern of information you can deduce from simple facts or knowledge.

Math is a lever on knowledge.

With really early math, the problems solved were often very concrete. "Whose farm is larger" (and hence should be taxed more) is one of the earliest classes of mathematical problems out there.

Now, I could see the existence of really alien beings who are way smarter than us in certain areas, and dumber in others. They might have something like mathematics, but it might not be recognizable.

An alien who was sufficiently smart might have a mathematical notation that is so far beyond our understanding that, to them, our notation doesn't seem to express anything that isn't patently obvious without any notation.

miles01110
Posts: 611
Joined: Tue Oct 31, 2006 3:39 pm UTC

Postby miles01110 » Fri Mar 09, 2007 6:52 pm UTC

Yakk wrote:There is a black stone, and a white stone. If I pick up the black stone, and you pick up one of the two stones afterwards, what colour stone do you pick up?

You can use math to solve this, or you can just say "duh, the white stone".

The math proof, done properly, would require a boatload of notation. Using such a math proof would be considered silly to most people, and even most mathematicians.


Uh, how would this be complex? A proof by contradiction is 2 lines.

Answer: I pick up the white stone.
Proof: Suppose I pick up the black stone.
Contradiction, there is no black stone to pick up.

???

User avatar
Yakk
Poster with most posts but no title.
Posts: 11077
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Fri Mar 09, 2007 7:23 pm UTC

Do it formally. Have you ever seen a formal math proof?

The goal is that each step could be checked by a simple automaton -- algorithmically. Your proof is a bit longer than "duh, the white stone", because it contains some sketch of a formal proof.

To prove it, you might start with the axioms of set theory, describe the set, and then prove that if a set contains two distinct elements and one of them is removed, then selecting an element from what remains gives you the second element.

Ie:
Answer: I pick up the white stone.
Proof: Suppose I pick up the black stone.
Contradiction, there is no black stone to pick up.


Prove that the contradiction exists. There is a "duh" contradiction, but which axioms and derivation rules were used, and in what order, to generate both X and ~X as true when you assume you picked up the black stone?

My point is, a sufficiently intelligent being, when given Fermat's Last Theorem, might say "duh". They might view building up the framework and proof to solve Fermat's Last Theorem to be as silly as building all of formal set theory to solve the "white and black stone" problem.

And if the being is sufficiently smart, what they define as a "simple automaton" might be much smarter than what we define. So their individual deductive steps might be massively complex rules, their axioms might look strange to us, and they might be interested in areas of math that we have never seen.

And that is only at the medium-weird end of the scale.

miles01110
Posts: 611
Joined: Tue Oct 31, 2006 3:39 pm UTC

Postby miles01110 » Fri Mar 09, 2007 7:42 pm UTC

Yakk wrote:Do it formally. Have you ever seen a formal math proof?


Yes I have actually; I'm a graduate physics student. However, I am confounded by what you might call the "duh" proof of your black-and-white stone problem. Enlighten me?

Shadowfish
Posts: 309
Joined: Tue Feb 27, 2007 2:27 am UTC

Postby Shadowfish » Fri Mar 09, 2007 7:55 pm UTC

Strictly speaking, what do you define as math?

For now, I think a good definition might be "The basic rules of logic, anything that they imply about how to deal with numbers, and the symbols used to indicate them".

Obviously, the Egyptians and the Mayans had different symbols. I am wondering if the rules of logic are also arbitrary, and if they are not, as they seem not to be, what it is about the universe that ensures that smart aliens must come up with the same rules that we use.

I also think that language, in general, is arbitrary. That is, the physical things that it talks about exist, and the symbols that refer to these physical things are physically real, but the way that symbols are assigned meaning is arbitrary. Math seems to be a language, but the things that it talks about are not physical. What is it that makes math absolute?
How would you arrive at Maxwell's equations and relativity, or even at something easier like Newton's laws? Take universal gravitation: how would you prove that you can treat a spherical object (the Earth) as pulling from its center point on the center point of another spherical object (the Moon) without vector calculus?

Like you said, the aliens might have wacky math. I think this is the most likely possibility. But, it might not be the only possibility.

Say that the smart aliens had a language which our mathematicians cannot explain in terms of our math. Is it possible that after physicists learned this language, they would recognize that it can also describe Newtonian gravity, Maxwell's equations, and so on? I don't feel like this should be possible, for the reason that you said. Still, just because I can't imagine how to find Newtonian gravity without vector calculus, that is not enough to prove to me that it is impossible.

User avatar
FiddleMath
Posts: 245
Joined: Wed Oct 11, 2006 7:46 am UTC
Location: Madison, WI
Contact:

Re: Math

Postby FiddleMath » Fri Mar 09, 2007 8:01 pm UTC

Shadowfish wrote:Is math real?


Probably not.

Well, not except in the trivial sense; I mean, people are definitely affected by mathematics, and our society uses mathematics as an initial lever. (Think: money, calendars, time, cooking measures, engineering, etc.) To get at what I think you're trying to ask, allow me to rephrase:

"Does math exist outside of the consciousness of intelligent beings?"

And while you could still point to textbooks and measures and stuff, I'd say no. Yes, the normal mathematics models many real-world situations very well. Integer addition and subtraction, multiplication and division seem to operate well on discrete objects; continuous measures work well, etcetera. But these are models, not the things themselves.

Consider counting clouds. Often, clouds are in separable bunches. You can easily enumerate them like this; you can say that there are 5 clouds in the bunch to the right, and 3 clouds in the bunch to the left, and they might stay that way for a couple of hours. If you're on a largely uniform plain at high noon, you could look at clouds, close your eyes and spin in circles, and then reorient yourself nearly correctly based on the quantities of the clouds in each direction. But suppose the two bunches eventually blow into each other. Are there now 8 clouds in one bunch? Probably not; they intermingled and joined together. I hear you: "Well, duh, you're using a stupidly wrong model." Yes, I know, that's the point: you can actually get useful results from a stupidly wrong model.

You get even better results from a fairly accurate model. For example, consider Newtonian physics. We know that Newtonian physics works only at "low" velocities and on "large" scales, and that relativity and quantum mechanics are both better models of the real world at their appropriate scales. Even so, Newtonian physics is still an excellent model for the mechanics of things in your daily life. It's wrong, yes, but only wrong in ways that don't change much on a daily basis, outside of those technologies that make explicit use of other models of physics in their design.

So, mathematics is the study of conceptual modeling systems. Many of these systems have no immediately apparent use in the real world. For example, I know of some technological applications for Galois theory (hint: it's in your CD player), and they're quite useful in technology, but I don't know of a single natural event that is modeled by Galois theory. Such systems are "real" only insofar as they map well to reality. When you apply math to the real world, you're modeling some aspect of the world. If the mapping relation between reality and your mathematical model is inaccurate, your results will probably be inconsistent with reality even if your math is totally internally consistent. Thus, if you try to add clouds, you'll be wrong.

Math isn't what's happening; math is what we use to understand it. Mathematics is merely half of a metaphor. And yes, I can imagine an alien society having a totally different system of mathematics than we do; the process of learning each other's systems might be the greatest boon of such a contact. Of course, I'm very nearly a mathematician, and thus a little biased. :)

User avatar
Belial
A terrible sound heard from a distance
Posts: 30448
Joined: Sat Apr 15, 2006 4:04 am UTC
Contact:

Postby Belial » Fri Mar 09, 2007 8:05 pm UTC

I also think that language, in general, is arbitrary


Yes.

That is, the physical things that it talks about exist, and the symbols that refer to these physical things are physically real, but the way that symbols are assigned meaning is arbitrary


Also yes. You nearly just quoted Ferdinand de Saussure.

Math seems to be a language, but the things that it talks about are not physical. What is it that makes math absolute?


But they are physical, they just aren't necessarily objects (though, especially in simple arithmetic, they can be). They're curves, or forces, or speeds, or directions, or units of measure. Yes, you can talk about math without talking about these things, just as you can speak English without talking about objects. You can even use language to talk about itself, as we're doing now. Same with math.


Still, just because I can't imagine how to find Newtonian gravity without vector calculus, that is not enough to prove to me that it is impossible.


Except that calculus isn't just a way to *get* to Newtonian physics, it *is* Newtonian physics. In its applied form, it's a way of *describing* Newtonian physics. If you describe it in any other mathematical language, it's still the same concept.
addams wrote:A drunk neighbor is better than a sober Belial.


They/them

User avatar
SpitValve
Not a mod.
Posts: 5126
Joined: Tue Sep 26, 2006 9:51 am UTC
Location: Lower pork village

Postby SpitValve » Fri Mar 09, 2007 8:13 pm UTC

Nomic wrote:Laws of psychics...


lol at spelling


Anyway... as for the black/white rock thing, I guess if you wanted to be more rigorous you could do something like this:

Rule 1: There are two objects called rocks.
Rule 2: One rock has a property known as black
Rule 3: The other rock has a property known as white.
Rule 4: Both rocks are initially in a position called "the ground"
Rule 5: A person may "pick up" a rock if it is on "the ground". Such a rock is no longer on the ground and may not be picked up again.

Rules 1-4 are simply the initial conditions. Rule 5 is "physics" I guess you could say.

Then if I say "If I pick up the black stone and you pick up one of the two stones, what colour did you pick up?"

Answer:
There are only two rocks that it is ever possible to pick up. Therefore, if you are unable to pick up one rock, you must have picked up the other rock.

The black rock has been picked up. By rule 5 you can't pick up a rock that has been picked up. Therefore, there is only one rock left to pick up. You pick up the white rock.



Is that what you meant by mathematical rigour? Or by mathematical rigour, do you mean using greek-letter-symbols (instead of word-symbols)?

Personally I much prefer miles' version :) But that's because I'm also a physics grad student maybe...

User avatar
SpitValve
Not a mod.
Posts: 5126
Joined: Tue Sep 26, 2006 9:51 am UTC
Location: Lower pork village

Postby SpitValve » Fri Mar 09, 2007 8:16 pm UTC

Belial wrote:
Still, just because I can't imagine how to find Newtonian gravity without vector calculus, that is not enough to prove to me that it is impossible.


Except that calculus isn't just a way to *get* to Newtonian physics, it *is* Newtonian physics. In its applied form, it's a way of *describing* Newtonian physics. If you describe it in any other mathematical language, it's still the same concept.


Yeah fully.

I can't imagine how you can have newtonian physics without some concept of "position", "change in position" and "change in change in position". And once you have position, velocity and acceleration, any mathematics describing how they interact is calculus in disguise.

User avatar
Belial
A terrible sound heard from a distance
Posts: 30448
Joined: Sat Apr 15, 2006 4:04 am UTC
Contact:

Postby Belial » Fri Mar 09, 2007 8:29 pm UTC

Yay, I said something intelligent about math and physics. I feel I've justified my presence in these fora now.

::retreats back into Liberal Arts Land::
addams wrote:A drunk neighbor is better than a sober Belial.


They/them

Shadowfish
Posts: 309
Joined: Tue Feb 27, 2007 2:27 am UTC

Postby Shadowfish » Fri Mar 09, 2007 9:19 pm UTC

I can't imagine how you can have newtonian physics without some concept of "position", "change in position" and "change in change in position". And once you have position, velocity and acceleration, any mathematics describing how they interact is calculus in disguise.


Well said. Thanks also FiddleMath and Belial, those are very good answers.

User avatar
OmenPigeon
Peddler of Gossamer Lies
Posts: 673
Joined: Mon Sep 25, 2006 6:08 am UTC
Contact:

Postby OmenPigeon » Fri Mar 09, 2007 9:35 pm UTC

Yakk wrote:The goal is that each step could be checked by a simple automaton -- algorithmically.


Actually, the concept of 'proof' is somewhat subtler than that. In principle, yes, in order to know something mathematically we should have a formal proof which we then survey and check for correctness. But that's a pretty absurd standard.

Consider the first proof of the four-color theorem. It was published in '77 or sometime thereabouts, and relied heavily on a computer program written by the son of one of the mathematicians. Without going into too much detail, there are three important cases in the last step of the proof, the third of which has over 1500 sub-cases to consider. The computer program found the right set of sub-cases and showed some property on them all, in something like 1200 hours of computer time. The proof is correct, but it is clearly not surveyable by a human. What we can do is prove that the program is correct, that the input given to it is correct, and that the machine functioned correctly while it was running, and thus be satisfied that the output of the program was correct. Going further, we could modify the program to create a proof written in some machine-readable proof language as it ran, and then create a proof-checking program (these exist, and can be quite small and easily proven correct) to check the proof thus created. This was later done for a variant on the '77 proof.

Now consider a more common case. Turing's 1936 paper was the first (apologies to Church and Post) to formally define an algorithm. It also proved some astonishingly important things about computability. Now, since before 1936 no one had a mathematical definition of an algorithm, I think it's fair to say that any definition of proof could not have used the notion of algorithms. How, then, did Euler 'know' any of the things he proved? (More interestingly, but somewhat unrelated, how did Ramanujan 'know' anything?)

More to the point, I would wager that of all the things you know to be mathematically true, you have only seen a handful formally proved. How do you know the rest? Because formal proof is not the only kind of proof. Any argument which is convincing to a reasonable mathematician can be called a proof. But what is a reasonable mathematician? Am I one? If not, how can I know the mind of this other mathematician?

There are more things in heaven, earth and mathematics, Horatio, than are dreamt of in your formal system.
As long as I am alive and well I will continue to feel strongly about prose style, to love the surface of the earth, and to take pleasure in scraps of useless information.
~ George Orwell

User avatar
Strilanc
Posts: 646
Joined: Fri Dec 08, 2006 7:18 am UTC

Postby Strilanc » Fri Mar 09, 2007 10:09 pm UTC

I've always considered axioms to be invented, and everything else to be discovered.

As for multiple cultures using the same axioms, I would put that down to being the same species in the same general environment: the same axioms are useful.
Don't pay attention to this signature, it's contradictory.

User avatar
SpitValve
Not a mod.
Posts: 5126
Joined: Tue Sep 26, 2006 9:51 am UTC
Location: Lower pork village

Postby SpitValve » Fri Mar 09, 2007 10:22 pm UTC

Alky wrote:I've always considered axioms to be invented, and everything else to be discovered.

As for multiple cultures using the same axioms, I would put that down to being the same species in the same general environment: the same axioms are useful.


Like how sometimes creatures independently evolve the same adaptations for the same environment... (although they may take different forms). Like wings.

User avatar
Yakk
Poster with most posts but no title.
Posts: 11077
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 10, 2007 12:54 am UTC

OmenPigeon wrote:Actually, the concept of 'proof' is somewhat subtler than that. In principle, yes, in order to know something mathematically we should have a formal proof which we then survey and check for correctness. But that's a pretty absurd standard.


Or, we should believe that we can reduce the problem to that state, without actually doing it (because we are lazy).

My point isn't that we can't do larger jumps than a formal proof. My point is that doing a full-scale mathematical proof for some problems is slower than just saying "duh". Practically, people do this all the time when they write out proofs -- they leave steps out, hand-wave around problems they think are solvable, and generally skip over tedium.

This is a good thing. Now, what happens when we interact with a being whose mathematical ability differs significantly from our own? I could see them seeing a completely different set of statements as "not worth bothering to demonstrate". Gone far enough, and their mathematics would be alien and pretty damn useless to us (and vice versa).

OmenPigeon wrote:Now consider a more common case. Turing's 1936 paper was the first (apologies to Church and Post) to formally define an algorithm. It also proved some astonishingly important things about computability. Now, since before 1936 no one had a mathematical definition of an algorithm, I think it's fair to say that any definition of proof could not have used the notion of algorithms. How, then, did Euler 'know' any of the things he proved? (More interestingly, but somewhat unrelated, how did Ramanujan 'know' anything?)


Ramanujan suspected things. He was quite often wrong.

The idea that a proof could be checked by someone far dumber than the genius who wrote the proof goes back further than the existence of algorithms. Heck, the idea of algorithms goes back further than the formal description of them. :)

I am talking about a very refined version of the term "proof" -- in general, a "proof" is just a way to make a statement seem more true.

Note that many many proofs "back in the day" turned out to be wrong -- they had unstated assumptions, bad derivation rules, and inconsistent results.

You can believe/know things without proof. :)

More to the point, I would wager that of all the things you know to be mathematically true, you have only seen a handful formally proved. How do you know the rest?


I have seen very few formal proofs. Formal proofs are a beast.

I have seen informal proofs for a far larger chunk. When I've seen an informal proof, it means I have some confidence that it can be rewritten as a formal proof.

I suspect I could generate an informal proof in many many other cases.

I suspect someone could generate an informal proof in another large chunk of cases.

The rest, I just suspect are true.

Because formal proof is not the only kind of proof. Any argument which is convincing to a reasonable mathematician can be called a proof. But what is a reasonable mathematician? Am I one? If not, how can I know the mind of this other mathematician?


*nod*, which means a proof is not a huge guarantee of truth.

Part of what makes a proof convincing to me is the understanding that I could reduce it to steps that are so simple, a regular expression engine could verify them.

There are more things in heaven, earth and mathematics, Horatio, than are dreamt of in your formal system.


Who in their right mind would dream in a formal system? Ick!

Here is a really half-assed semi-formal attempt. I am attempting to reuse standard math theory, rather than make a minimal set of axioms to solve the problem -- after all, the minimum set of axioms that solves the problem is just an axiom that asserts the conclusion we want. :)

Axioms:
Insert axioms of set theory here.
Z1 X is a set.
Z2 a in X is T
Z3 b in X is T
Z4 For all x, if x in X, then (x = a) or (x = b)

Theorem: If x is in X\a then x=b.
Proof
1 x in X\a is T
2 x in X is T by 1 and (definition of \ from set theory axioms)
3 not(x = a) is T by 1 and (definition of \ from set theory axioms)
4 (x = a) or (x = b) by 2 and Z4
5 ((x = a) or (x = b)) and not(x = a) by 3 and 4
6 ((x = a) and not(x = a)) or ((x = b) and not(x = a)) by 5 and distributive law
7 (F) or ((x = b) and not (x = a)) by 6 and non-contradiction
8 (x = b) and not (x = a) by 7 and ???
9 x = b by simplifying 8

Practically, that isn't as formal as it could be. A computer couldn't verify the above, for example, so it still relies on a huge amount of hidden and unmentioned assumptions about how our math system works. I would not be confident that the above proof is accurate and doesn't contain an error.

It also doesn't contain the proof of:
Theorem: If x=b then x is in X\a.

which would be needed to match up with "the rock he picks up is white".

Next, attach X to "on the ground", a to be "the black rock" and b to be "the white rock".

As you make the argument less informal and more formal, the proof gets more and more unwieldy. However, verifying each step requires less and less intuition about the problem at hand.
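
For comparison: the core step of the stone argument ("if x is one of a and b, and x is not a, then x is b") is small enough that a modern proof assistant could check it mechanically once the set-theoretic scaffolding lives in its library. A rough sketch, assuming Lean 4 syntax; it is an illustration of the idea, not a formalization of the exact axioms Z1-Z4 above:

-- Sketch: a value known to be one of two things, and known not to be the first,
-- must be the second. Or.elim case-splits on the hypothesis; absurd closes the
-- impossible branch.
example (α : Type) (a b x : α) (hx : x = a ∨ x = b) (hxa : x ≠ a) : x = b :=
  hx.elim (fun h => absurd h hxa) id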

So, one could imagine a being so intelligent that Newtonian dynamics is no more worth formalizing than the white rock/black rock system. Or even relativity. Fermat's Last Theorem would be dismissed with an "obviously" -- and not because they just thought it was true, but because the problem is simple enough that anyone with half-assed training could derive the result.

That level of mathematician isn't something we could talk to very well using mathematics. Either each step in their proof would look like steps the size of "prove Fermat's Last Theorem", or their informal proofs (which they find tedious to write out) would be complex and the size of the Library of Congress. The statements they are proving could involve definitions that we simply can't grasp, built on multiple levels of abstraction (many of which implicitly assume the reader can prove Fermat's Last Theorem) that we just don't get.

If someone walked up to you and started laying out a formal proof about the white/black rock problem, and wasn't able to understand proofs much more complex... talking with them mathematically would be very difficult.

And it would get worse if each of us were crippled in the areas where the other being wasn't.

User avatar
warriorness
Huge Fucking-Lazer
Posts: 1610
Joined: Thu Dec 28, 2006 10:33 am UTC
Location: CMU, Pittsburgh, PA, USA
Contact:

Postby warriorness » Sat Mar 10, 2007 4:42 am UTC

Calculus was invented, though. It's a method of evaluating stuff that's used for other stuff. Before Newton/Leibniz, we didn't have it.

Before you learned calculus, you thought "oh, that's an entirely different type of math, must be an invention", etc., and afterwards it just seems like something natural. Who's to say it's not the same way with more elementary math? It's just a lot harder to think of it as a method of evaluating stuff, because it's so simple. It's easy to grasp how calculus is an invention, as it's very complex. But addition? You probably knew how to add before you even realized that it's just a method of evaluating stuff. And so did everybody else; therefore it's a lot older. (EDIT: clarification: people realized it earlier, so it's been around longer)

I propose that all mathematics is an invention. We use calculus, an invention, to show that when things fall, their motion can be described by (1/2)gt^2. We use basic math to show that when you combine a set of x elements with a set of y elements, you get a set of x+y elements. What is it about basic math that makes it different from calculus in this manner?
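
As an aside, the calculus behind that (1/2)gt^2 claim is just two integrations; this is the standard textbook derivation (assuming constant acceleration g and release from rest, so it is only a sketch of the familiar special case):

v(t) = \int_0^t g \, d\tau = g t, \qquad s(t) = \int_0^t g \tau \, d\tau = \tfrac{1}{2} g t^2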

And if you're going to point out that calculus was not invented but discovered, at least provide an argument for it.
Iluvatar wrote:Love: Gimme the frickin' API.
yy2bggggs, on Fischer Random chess wrote:Hmmm.... I wonder how how a hypermodern approach would work

User avatar
3.14159265...
Irrational (?)
Posts: 2413
Joined: Thu Jan 18, 2007 12:05 am UTC
Location: Ajax, Canada

Postby 3.14159265... » Sat Mar 10, 2007 4:44 am UTC

For clarification purposes: Would you also consider all science inventions? If not, how come, since science is just our way of saying how stuff reacts to stuff?
"The best times in life are the ones when you can genuinely add a "Bwa" to your "ha""- Chris Hastings

User avatar
Belial
A terrible sound heard from a distance
Posts: 30448
Joined: Sat Apr 15, 2006 4:04 am UTC
Contact:

Postby Belial » Sat Mar 10, 2007 5:17 am UTC

Pi makes a valid point. The method of describing the way things fall is invented, but the phenomenon being described was always there, and our notation didn't change it.

That's like saying that, because we invented our word for "rock", that rocks were invented and not discovered. You're conflating the signifier with the signified, when they're not the same.
addams wrote:A drunk neighbor is better than a sober Belial.


They/them

User avatar
OmenPigeon
Peddler of Gossamer Lies
Posts: 673
Joined: Mon Sep 25, 2006 6:08 am UTC
Contact:

Postby OmenPigeon » Sat Mar 10, 2007 5:36 am UTC

First, let me define some terms. Invention is the creation of a new thing. Something invented did not exist prior to its invention. Let us assume that Earth contains the only intelligent life in the universe, since it's fruitless to argue that prior to Edison's light bulb (clearly a created artifact) aliens on Tau Ceti IX had light bulbs. Discovery is the creation of new knowledge about an old thing. For example, Columbus' discovery of a new continent: it was there long before 1492, but no one in Spain knew about it.

It should be clear that, under these definitions, much "natural science" is discovery: gravity existed before Newton ever thought about it; evolution acted on populations before Darwin ever visited the Galapagos. Science does not inherently involve invention. I'm not sure quite what you mean, Pi, by saying "science is just our way of saying how stuff reacts to stuff", but that is why I don't consider science to be all invention.

What is invention is much of engineering and applied science: new ways to build bridges are inventions because the theories about how to hold a lot of mass above the ground didn't exist before some guy figured them out. The laws of physics that allowed him to build a cool new bridge were discovered; the new bridge was invented.

Mathematics, then. The fundamental difference between mathematics and the natural sciences is that mathematical truth can be ascertained without appeal to empirical reality. If I want to verify Maxwell's equations I need to study some electricity. If I want to verify Euler's formula I need only to sit and think for a while.

This does not immediately lead to the assertion that calculus was discovered rather than invented. What it does lead to is that if you accept that elementary mathematics is a discovery rather than an invention, then you need to accept that all mathematics is discovered. You don't, however, need to accept that.

The problem is an epistemological (and, I believe, personal) one. Does mathematics exist outside of the mind of humanity? Many mathematicians (I think) believe that it does. The opposite view is completely valid, though: that all of math is merely a construction in the mind of humans to provide order in a chaotic world. The first view leads very naturally to believing that new mathematical knowledge is discovery, the second towards believing that math is invention. I have problems with both of these.

With the first, my problem is this: computers are mathematical objects, and the programs I write on them are therefore also mathematical objects. If all of math exists outside of myself, then whenever I write a program I am discovering a new mathematical object. However, writing a program is a creative act: except for very trivial programs, I can be fairly certain that, probabilistically, the programs I write have never been written before. Moreover, it feels like invention. I know my feelings aren't a valid form of proof, but still.

My problem with the second view is one that has already been brought up in this thread: the consistency and independent creation of mathematical truth. If math exists solely in human minds, how is it that Nash, one of the more brilliant minds of our time, could independently prove so many things and still arrive at the exact same results?

I lean, for the most part, towards the discovery camp. I feel that calculus was a method of describing reality discovered by Newton and Leibniz. Functions certainly existed before them, and the idea of the slope of a curve at a point and the area under a curve existed before them. They just figured out how to work with those ideas.

A vague argument which may help clarify things: the most important distinction between a discovery and an invention, for me, is that an invention could be different. By this I mean that the world would still be internally consistent if Edison had built a different light bulb, but that the world would no longer be internally consistent if gravity were directly proportional to the square of the distance between two objects, and inversely proportional to their masses, everything else remaining the same. Seen in this light, mathematics must be discovery: the system would be inconsistent if calculus were any different. On the other hand, it would be perfectly consistent if my programs were altered.

Certainly no one can prove that mathematics exists in some ontological aether. But it certainly seems to me as though new mathematical knowledge has the flavor of discovery rather than invention.
As long as I am alive and well I will continue to feel strongly about prose style, to love the surface of the earth, and to take pleasure in scraps of useless information.

~ George Orwell

User avatar
warriorness
Huge Fucking-Lazer
Posts: 1610
Joined: Thu Dec 28, 2006 10:33 am UTC
Location: CMU, Pittsburgh, PA, USA
Contact:

Postby warriorness » Sat Mar 10, 2007 5:59 am UTC

Belial wrote:Pi makes a valid point. The method of describing the way things fall is invented, but the phenomenon being described was always there, and our notation didn't change it.

That's like saying that, because we invented our word for "rock", that rocks were invented and not discovered. You're conflating the signifier with the signified, when they're not the same.


But math is a method of describing something. The phenomenon occurs; math is how we analyze it.

Upon further thought, I realize that some things can't have been invented:

3.14159265... wrote:My high school physics teacher once said there are 3 symbols in math (the plus sign, the equals sign, the limit). Everything else can be written as combinations of those. But you need THOSE!


Because addition and equality are completely fundamental, it seems that they must have been discovered. But even multiplication is derived from addition (x*y simply represents y+y+y..., x times). I disagree with the assertion about the limit, though, as it can be expressed in a "simpler" (i.e. more fundamental) form: the whole epsilon-delta definition. What's a definition, though, for addition and equality?
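
For reference, the epsilon-delta form being alluded to is the standard textbook definition of the limit (nothing beyond the usual statement):

\lim_{x \to a} f(x) = L \iff \forall \varepsilon > 0 \; \exists \delta > 0 : \big( 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon \big)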
Iluvatar wrote:Love: Gimme the frickin' API.
yy2bggggs, on Fischer Random chess wrote:Hmmm.... I wonder how how a hypermodern approach would work

User avatar
3.14159265...
Irrational (?)
Posts: 2413
Joined: Thu Jan 18, 2007 12:05 am UTC
Location: Ajax, Canada

Postby 3.14159265... » Sat Mar 10, 2007 6:09 am UTC

Ya I am just getting to that definition of limits now in class.

Also, let's use a mathematical proof.

Say mathematical ideas were not there already.
This means that the value we designate by the symbol 1 in base 10, added to another 1 in base 10 to make a 2 in base 10, would also not be true unless someone said it was.
But the idea that a 1 and another 1 is always 2 of a 1 is inherently there whether we say it or not.
Granted, we came up with the symbols, but that idea is definitely true.
Therefore our assumption must have been wrong that this idea was not already there.

Therefore 1+1=2 is a mathematical idea that is inherently there.
Thus the + sign and = sign are inherently there even if we don't designate symbols.
Thus all of mathematics utilizing the + and = signs is inherently there.

Anyone care to develop the limit? I failed, as I had a bio exam today and 2 exams tomorrow, and am studying... sorta
"The best times in life are the ones when you can genuinely add a "Bwa" to your "ha""- Chris Hastings

User avatar
warriorness
Huge Fucking-Lazer
Posts: 1610
Joined: Thu Dec 28, 2006 10:33 am UTC
Location: CMU, Pittsburgh, PA, USA
Contact:

Postby warriorness » Sat Mar 10, 2007 6:21 am UTC

3.14159265... wrote:But the idea that a 1 and another 1 is always 2 of a 1 is inherently there whether we say it or not.
Granted, we came up with the symbols, but that idea is definitely true.
Therefore our assumption must have been wrong that this idea was not already there.

Therefore 1+1=2 is a mathematical idea that is inherently there.
Thus the + sign and = sign are inherently there even if we don't designate symbols.
Thus all of mathematics utilizing the + and = signs is inherently there.


That's what I was getting at in my last post, but I was kind of vague (discussing this with a friend helped clear it up). These fundamental truths are Euclid's axioms - addition, equality, reflexivity, and the stuff about points/lines/planes/etc. Everything else in mathematics is a way of describing and analyzing those axioms, just as the Periodic Table is a way of describing and analyzing the various elements. The axioms (elements) weren't invented; they were always there. But the system of studying them, mathematics (the Periodic Table), was invented.
Iluvatar wrote:Love: Gimme the frickin' API.
yy2bggggs, on Fischer Random chess wrote:Hmmm.... I wonder how how a hypermodern approach would work

User avatar
Belial
A terrible sound heard from a distance
Posts: 30448
Joined: Sat Apr 15, 2006 4:04 am UTC
Contact:

Postby Belial » Sat Mar 10, 2007 6:26 am UTC

What you're saying is true, but it's also irrelevant. Notation can be adapted, so long as the underlying principles are the same.

Gravitation and such aren't going to change, and so long as the underlying axioms are constant, and the thing being described (physics) is constant, the way we describe them can be translated.
addams wrote:A drunk neighbor is better than a sober Belial.


They/them

User avatar
warriorness
Huge Fucking-Lazer
Posts: 1610
Joined: Thu Dec 28, 2006 10:33 am UTC
Location: CMU, Pittsburgh, PA, USA
Contact:

Postby warriorness » Sat Mar 10, 2007 6:29 am UTC

Belial wrote:What you're saying is true, but it's also irrelevant. Notation can be adapted, so long as the underlying principles are the same.

Gravitation and such aren't going to change, and so long as the underlying axioms are constant, and the thing being described (physics) is constant, the way we describe them can be translated.


To whom was this post directed?
Iluvatar wrote:Love: Gimme the frickin' API.
yy2bggggs, on Fischer Random chess wrote:Hmmm.... I wonder how how a hypermodern approach would work

User avatar
3.14159265...
Irrational (?)
Posts: 2413
Joined: Thu Jan 18, 2007 12:05 am UTC
Location: Ajax, Canada

Postby 3.14159265... » Sat Mar 10, 2007 6:33 am UTC

Exactly, so long as the ideas of mathematics are being presented, we can "translate" them.

If someone said ojiba and ojiba make bojiba, and raised his long slimy green finger when saying ojiba, and then raised two slimy fingers (one of which had an eye on it) when saying bojiba. And then confirmed it when you raised one hand and said ojiba, and then the other and said ojiba, and then both and said bojiba.

Then you know ojiba is 1 and bojiba is 2.

It would be hard to confirm and translate.
But see, given the first two axioms of equality and addition, everything else is derived. A LOT of other stuff is derived.
But, um, that lines/parallel thing isn't; that's why you can get entirely other mathematics which is also correct.
So first we pick axioms (+, =), prove them using 1+1=2, and then develop everything from them.

No?
This may not be making any sense as I'm reading about the self-fulfilling prophecy right now. Irony.
"The best times in life are the ones when you can genuinely add a "Bwa" to your "ha""- Chris Hastings

User avatar
Yakk
Poster with most posts but no title.
Posts: 11077
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 10, 2007 6:43 am UTC

I know multiple different "calculus" explanations and theories. There is more than one way to describe rates of change, continuity, and area. These different ways of describing "calculus" even give different results in some cases.

In one version of calculus, the lim (x->0) of 1/x exists. In another it doesn't.

In one version of calculus, the intermediate value theorem holds. In another it doesn't.

In one version of calculus, there are discontinuous real-to-real functions. In another, no such function exists.

In one version of calculus, the delta-impulse function exists. In another, it doesn't.

In one version of calculus, you can take a sphere of radius 1, break it into 5 distinct subsets, translate and rotate those 5 subsets, and end up with two spheres of radius 1 with every single point covered. In another, you can't.

So calculus could have been different.

Even addition and subtraction could have been different. Yes, even something as simple as that.

Can you use math to describe reality without the counting numbers? Maybe. It wouldn't be the same description we came up with, and some parts would probably be more awkward.

There are multiple different sets of logic axioms you can use to base your math on. They can even lead to different results -- and in some cases, it is arguable which of the results are "more true".

There is a famous quote that goes something like "The integers really exist. Everything else, mankind invented."

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Sat Mar 10, 2007 7:19 am UTC

My monitor is currently set to 1280x1024 pixels. The color depth supported by my video card is 32 bits. Thus, there are 2^(32*1280*1024) different images that this monitor can display. I can, in theory, create any of them using MS Paint. Among these possible configurations of pictures that can be shown on the monitor, there exist random noise, all black, and a photorealistic representation of some particular archaeopteryx.

So if I spend a few hours with MS paint, in 1280x1024 resolution, I could certainly come up with something that is essentially unduplicatable. This creation of mine would most definitely qualify as an invention. Still, this same piece of computer art is in fact simply one of those 2^(32*1280*1024) possible combinations.
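
(For scale: that count is (2^32)^(1280*1024) = 2^(32*1280*1024) = 2^41,943,040, which works out to a number with roughly 12.6 million decimal digits.)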

In this context, inventions are discoveries. There's no or here.

Granted, this isn't a perfect analogy to math, but math is even more of an invention, in that there are a lot more possible mathematics that could be developed; however, it's also even more of a discovery, in that people rarely create mathematical systems "in full", knowing all of their properties (it's certainly done to cover trivial case entities, but we're mostly interested in the nontrivial).

So, my answer to the question--is math an invention or a discovery--is:
yes

User avatar
Gelsamel
Lame and emo
Posts: 8237
Joined: Thu Oct 05, 2006 10:49 am UTC
Location: Melbourne, Victoria, Australia

Postby Gelsamel » Sat Mar 10, 2007 9:41 am UTC

Depends on what you're talking about. OUR (system of) Math is certainly an invention, but math itself is not.

User avatar
Pathway
Leon Sumbitches...?
Posts: 647
Joined: Sun Oct 15, 2006 5:59 pm UTC

Postby Pathway » Sat Mar 10, 2007 10:43 am UTC

Yakk wrote:What math lets us do is solve really really hard problems that are much like the above.

Yes. But math isn't just a problem solving tool. It's an abstraction of the ideas behind concepts that we encounter. Have you ever encountered the number one in everyday life? You've encountered a single object on its own, and when you've counted objects, it's been by some form of linear combination of single objects.

The concept of the number one, however, isn't obvious to everyone at first. We learn it when we're young. The idea of "one" itself, as a distinct mathematical object, requires the leap of intuition that there is something common about the way objects work--that you can generalize your counting to an abstraction which may be easier to work with: a concept called 'number.'

People are born not even grasping that objects are there when unseen.

Mathematics is the study of the abstractions we've developed. I'd like to believe that it can exist independently of any problems we might try to use it to solve.

It is a bunch of tried and tested rules that describe the pattern of information you can deduce from simple facts or knowledge.


I'd like to disagree again. It is a set of logical deductions from certain assumptions. By fixing these axioms appropriately we can prove these 'rules'--really theorems--but we don't, however, say that they're true because they're 'tried and tested.' They're not rules, but theorems. Not tested, but proven. (Well, yes, they're tested. But tests don't mean very much. There's still tons of money offered for a proof of Riemann. It's been tested to quite high n, but that isn't enough.)

OmenPigeon: If we understood exactly how Ramanujan's mind did what he did...

I would argue that most of mathematics is discovered, based on the following:

We have axioms to describe the system in which we're working. People defined them. Similarly, people proved all of proven mathematics. But of all the logically possible, consistent mathematical systems, ours (assuming no human error) is one. So the truths we've gained--that given A, we have B, which under condition C implies either D or E, etc.--always were possibilities for the system we defined.

As soon as we defined the derivative and the integral, it made sense to prove the Fundamental Theorem of Calculus.

But, at that time, after we had the necessary definitions and framework to talk about the FTC, but before we had conceived of it in its true form, was the FTC true?

If it was, that implies that mathematics is a discovery. We are figuring out that, given certain conditions (our definitions, our axioms), something is true. But we didn't invent that truth. Any true statement is valid, it exists, before we even think about it. And when we come upon something that already exists, it's a discovery.

User avatar
Yakk
Poster with most posts but no title.
Posts: 11077
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 10, 2007 4:01 pm UTC

Pathway wrote:
Yakk wrote:What math lets us do is solve really really hard problems that are much like the above.

Yes. But math isn't just a problem solving tool. It's an abstraction of the ideas behind concepts that we encounter.


And how do we know if the math/abstraction is a good one or not?

I can generate a billion completely invalid abstractions of counting. In some of them, addition isn't commutative. Some of them are inconsistent. Some of them prove things "true" which are either meaningless or not true. Some of them prove things "false" which you would think should be true. Many of them are rather incomplete, and can't prove things true or untrue that one would think are either true or untrue.

They don't reflect reality in certain ways. Some are "better" at it than others. Others are "bad abstractions".

Now, the counting abstractions we have are a bunch of "pretty good abstractions", if I could put my opinion of them forward.

Have you ever encountered the number one in everyday life? You've encountered a single object on its own, and when you've counted objects, it's been by some form of linear combination of single objects.


Ayep. But those single objects could be quirks of how we perceive the world.

At the scale we look at the world, there are many hard-edged things. At larger and smaller scales -- less so. It would take a relatively alien mind to not see objects. It would take a less alien mind to not see them as inter-countable without decoration, or just not countable.

That abstraction -- the turning of "a particular rock" into "an abstract instance of rock" is not something that was there before it was done. The thought made the abstraction, the reality did not imply the abstraction.

The concept of the number one, however, isn't obvious to everyone at first. We learn it when we're young. The idea of "one" itself, as a distinct mathematical object, requires the leap of intuition that there is something common about the way objects work--that you can generalize your counting to an abstraction which may be easier to work with: a concept called 'number.'

People are born not even grasping that objects are there when unseen.

Mathematics is the study of the abstractions we've developed. I'd like to believe that it can exist independently of any problems we might try to use it to solve.


Those abstractions are the tools we use to solve problems -- the abstraction of number lets us talk about "one tree" and "one rock" and the features (at the appropriate level of abstraction) that all "one" objects share.

It is a bunch of tried and tested rules that describe the pattern of information you can deduce from simple facts or knowledge.


I'd like to disagree again. It is a set of logical deductions from certain assumptions. By fixing these axioms appropriately we can prove these 'rules'--really theorems--but we don't, however, say that they're true because they're 'tried and tested.' They're not rules, but theorems. Not tested, but proven. (Well, yes, they're tested. But tests don't mean very much. There's still tons of money offered for a proof of Riemann. It's been tested to quite high n, but that isn't enough.)


I can invent an axiom that proves Riemann.

The sketch of the axiom is:
Axiom R: The Riemann hypothesis is true.

Now, that isn't what you mean by prove Riemann is it?

The common/main axioms of mathematics, and the axioms of logic, were chosen because they were beautiful and seemed to solve a particular class of questions/problems. They are somewhat often poked at and changed. The most common set of logical axioms we use today is different from the ones many theologians in the Middle Ages used.

The accepted rules of derivation from Truth to Truth did not pre-exist a huge amount of study; they can be and have been changed. The ones we use today are used widely because they have been found to be "useful" or "beautiful".

So when you throw out the useful axioms and derivation rules of truth-determining and invent a useless one, your proofs aren't interesting or useful. The goal in a proof isn't to prove something -- it is to show that the something follows from simpler statements we have found to be useful, interesting and true.

But, at that time, after we had the necessary definitions and framework to talk about the FTC, but before we had conceived of it in its true form, was the FTC true?


One could demonstrate that there could be a valid proof chain that proved FTC, given modern axioms and proof rules.

If your definition of true is "there can be a valid proof chain", then it was true (modulo a really wierd universe).

If your definition of true is "there is a valid proof chain", then it wasn't true yet.

If your definition of true is something else...

If it was, that implies that mathematics is a discovery. We are figuring out that, given certain conditions (our definitions, our axioms), something is true. But we didn't invent that truth. Any true statement is valid, it exists, before we even think about it. And when we come upon something that already exists, it's a discovery.


So I ask you, what do you mean by "true"?

Take the intermediate value theorem. Is it true? If you have a nice, smooth function that is at one point above zero, and at another below zero, and stays in the real numbers, does it cross zero?

The classic IVT doesn't produce the x such that f(x)=0 or in any way guarantee that you can find the x -- but under conventional mathematical logic, there is a proof that says that such an x exists. It is an assertion of existence: but because it doesn't describe how to get it, it isn't all that hard to make a theory of calculus that describes the same real-world phenomena, yet doesn't agree!

There are other calculus theories in which the intermediate value theorem doesn't hold -- if you have a continuous function f where f(0)=-1 and f(1)=1, there is no guarantee that you can find an x such that f(x)=0. In the one I'm thinking of, for any rational epsilon > 0, you can find an x such that -epsilon < f(x) < epsilon -- but the derivation rules that allow you to go from there to an actual f(x) that equals zero have been removed from the theory.

In a sub-theory, you can't even add them back in. You can add the axiom "all real-to-real functions are continuous", which is consistent and useful, but rules out some of the "truths" of standard calculus.

...

My point is, we pick our axioms and derivation rules and what we mean by them. We hook them up to reality in an attempt to abstract it, and from that abstraction draw knowledge. But the reality itself did not demand any particular abstraction to describe it.

Mankind invented the abstractions of mathematics.

User avatar
LE4dGOLEM
is unique......wait, no!!!!
Posts: 5972
Joined: Thu Oct 12, 2006 7:10 pm UTC
Location: :uoıʇɐɔol

Postby LE4dGOLEM » Sat Mar 10, 2007 4:17 pm UTC

2s = b + w

(b+w) - b = w

2s - b = w

s = w

s - w = 0s


???


EDIT:

Every system thinks they are base-10. People that only ever counted in binary would think they count in base 10, because that's what "10" means. Base-8 goes 1,2,3,4,5,6,7,10,11,12,13,14,15,16,17,20... and if they don't have a single symbol for 8 and 9, they don't exist to them. We use Base-10 (1,2,3,4,5,6,7,8,9,10) because (Arguably) we (normally) have ten toes (being sub-appendages. Fingers, apparently, don't count for numerical systems, although usually (usually) sub appendages on fore-appendages are of the same quantity of rear-appendage sub-appendages). If an "alien" were to have, say, 14 toes/fingers/whatever (by our counting), they'd still think they used base-10, even if their numerical system went 1,2,3,4,5,6,7,8,9,♥,♦,♣,♠,10
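
To make the "every base calls itself base-10" point concrete, here is a rough Python sketch (just an illustration, using the ordinary repeated-division algorithm): whatever radix b you pick, the value b itself always comes out as the digits 1, 0.

# Illustrative sketch: write n as a digit list in radix b (most significant first).
def to_base(n, b):
    digits = []
    while n > 0:
        digits.append(n % b)
        n //= b
    return digits[::-1] or [0]

for b in (2, 8, 10, 14):
    print(b, "written in its own base:", to_base(b, b))  # always [1, 0]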


DOUBLE EDIT: I think that's enough Serious Business for me
Une See Fights - crayon super-ish hero webcomic!
doogly wrote:It would just be much better if it were not shitty.

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Sat Mar 10, 2007 4:30 pm UTC

LE4dGOLEM wrote:Every system thinks they are base-10.

Only positional number systems using 0.

Think Roman numerals, and alphabet based numbering systems.

User avatar
3.14159265...
Irrational (?)
Posts: 2413
Joined: Thu Jan 18, 2007 12:05 am UTC
Location: Ajax, Canada

Postby 3.14159265... » Sat Mar 10, 2007 4:42 pm UTC

Why is that in this argument?
And I would say there won't be someone who would respond to YAKK
"The best times in life are the ones when you can genuinely add a "Bwa" to your "ha""- Chris Hastings

User avatar
Yakk
Poster with most posts but no title.
Posts: 11077
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Postby Yakk » Sat Mar 10, 2007 6:51 pm UTC

My favourite base:
9,699,690

It has the wonderful property that for all numbers up to 20, 1/x is a terminating decimal!

Or the prime-product number-base system. The first digit (before and after the decimal) is base 2, the second base 2*3, the third 2*3*5, the fourth 2*3*5*7, the fifth 2*3*5*7*11, etc. All rational numbers are terminating decimals in this system. :)

So 100 is:
100 mod 2 = 0
100/2 = 50
50 mod 2*3 = 2
(50-2)/6 = 8
8 mod 2*3*5 = 8
(8-8)/30 = 0

So, 100 is encoded as {8,2,0}

And, written in itself:
{{{1,0},0},0},{1,0},0

heh.
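
For anyone who wants to play with this, a rough Python sketch of the prime-product system described above (assuming, as in the worked example, that the radix of the i-th digit is the product of the first i+1 primes: 2, 6, 30, 210, ...); it reproduces the {8, 2, 0} encoding of 100.

# Illustrative sketch of the mixed-radix "prime-product" system described above.
primes = [2, 3, 5, 7, 11, 13]

def to_primorial(n):
    # Digit radixes are 2, 6, 30, 210, ... (built least-significant digit first).
    digits, radix = [], 1
    for p in primes:
        radix *= p
        digits.append(n % radix)
        n //= radix
        if n == 0:
            break
    return digits[::-1] or [0]   # most-significant digit first, e.g. {8, 2, 0}

def from_primorial(digits):
    # The place value of each digit is the product of all lower digit radixes.
    value, place, radix = 0, 1, 1
    for i, d in enumerate(reversed(digits)):
        value += d * place
        radix *= primes[i]
        place *= radix
    return value

print(to_primorial(100))          # [8, 2, 0]
print(from_primorial([8, 2, 0]))  # 100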

User avatar
SpitValve
Not a mod.
Posts: 5126
Joined: Tue Sep 26, 2006 9:51 am UTC
Location: Lower pork village

Postby SpitValve » Sat Mar 10, 2007 8:02 pm UTC

LE4dGOLEM wrote:2s = b + w

(b+w) - b = w

2s - b = w

s = w

s - w = 0s


???


??? exactly

s = (b+w)/2

So you've written

b+w = b+w
b+w - b = w
b+w - b = w
so (miraculously) (b+w)/2 = w
and (b+w)/2-w=0

Could you clarify the significance, or is it just a mistake?

