## Something I've been wondering since 7th grade

**Moderators:** gmalivuk, Moderators General, Prelates

### Something I've been wondering since 7th grade

I guess it was 6.5 years ago, in a boring class in 7th grade, that I came up with this idea/question, and I've never heard of it since. My question was like this: if x*y is x plus itself y times, and x^y is x times itself y times, is there something in the other direction? Some sort of operation that, when done on x y times, gives you x+y. Anything that fits the pattern.

I've been trying to be patient and to wait until I learn enough to answer the question myself but I thought I might as well ask here.


### Re: Something I've been wondering since 7th grade

saus wrote:I guess it was 6.5 years ago, in a boring class in 7th grade, that I came up with this idea/question, and I've never heard of it since. My question was like this: if x*y is x plus itself y times, and x^y is x times itself y times, is there something in the other direction? Some sort of operation that, when done on x y times, gives you x+y. Anything that fits the pattern.

I've been trying to be patient and to wait until I learn enough to answer the question myself but I thought I might as well ask here.

Sure, we can define such an operation if we want.

Define an operator [imath]S(y)(\cdot)[/imath] as [imath]y*(\cdot)[/imath].

Then [imath]S(y)(\frac{x}{y}+1)[/imath] gives you what you're looking for. I suppose to answer your question, no, there is no natural or common operator that does such a thing. Of course, the things we use as operations are merely notational conventions to portray action on something. Since what you're looking for is already handled by an operator, namely if you want x+y you just add x and y, there has been no need to further define such an animal.
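This reading can be spot-checked numerically. `S` here is just the ad-hoc name from the post, not a standard operator, and the trick needs y ≠ 0:

```python
# S(y) is "multiply by y", so S(y)(x/y + 1) = y*(x/y) + y = x + y  (for y != 0).
def S(y):
    return lambda t: y * t

x, y = 8, 2
print(S(y)(x / y + 1))  # 10.0, i.e. x + y
```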

### Re: Something I've been wondering since 7th grade

saus wrote:Some sort of operation, when done on x y times gives you x+y. Anything that fits the pattern.

"+1"

x + y = x + 1 + 1 + 1 + ... + 1 (with y ones)

This signature is Y2K compliant.

Last updated 6/29/108


### Re: Something I've been wondering since 7th grade

Doing "+1" on x y times gets you "+y", but I think you mean plugging x into both sides of a binary operator, so x$x$x...(y times) equals x + y. I suppose if we want to be cheap we could define "a$b" as "a + b^0" (i.e. a + 1, for nonzero b). Doing that y times equals x + y.
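anthemyst's cheap definition folds up in a few lines (`dollar` is a stand-in name for $). Note the off-by-one that comes up later in the thread: y applications of the operator involve y+1 copies of x:

```python
from functools import reduce

# anthemyst's cheap operator: a $ b = a + b**0, i.e. a + 1 (for b != 0).
def dollar(a, b):
    return a + b**0

# Folding x $ x $ ... $ x over y+1 copies of x applies the operator y times,
# so y operations add y in total.
x, y = 4, 7
print(reduce(dollar, [x] * (y + 1)))  # 11, i.e. x + y
```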

GENERATION 4: The first time you see this, copy it into your sig on any forum and add 1 to the generation. Social experiment.

### Re: Something I've been wondering since 7th grade

Yes I meant more what anthemyst said rather than jmorgan3's thing.

The idea is I could take any two numbers and, using anthemyst's notation, I could find what number 4$7 is equal to. 4$4 would be 4+2 = 6 (like 4+4=4*2, 4*4=4^2).

I don't know any of the technical definitions of operations or anything, so I can only explain it in this conceptual way. I think it's possible that this thing doesn't exist. Please ask me if I'm still being too vague about my question.


### Re: Something I've been wondering since 7th grade

There's a case to be made that multiplication is not repeated addition (http://www.maa.org/devlin/devlin_06_08.html). I'm honestly not sure what to make of it yet, but the point he makes is that just because you come up with the same result multiplying and adding doesn't mean that you did the same thing to get there. That would explain exactly why your intuition that there should be an operation "@" that gives us x@x@...@x (y times) = x+y breaks down.

SargeZT wrote:Oh dear no, I love penguins. They're my favorite animal ever besides cows.

The reason I would kill penguins would be, no one ever, ever fucking kills penguins.

### Re: Something I've been wondering since 7th grade

Pluto all over again.

That's a good lead... for now I think this will return to the back of my mind. It's good to know that the answer is out there somewhere, to learn and fix this loophole in my understanding.


- phlip
- Restorer of Worlds
**Posts:**7572**Joined:**Sat Sep 23, 2006 3:56 am UTC**Location:**Australia-
**Contact:**

### Re: Something I've been wondering since 7th grade

The main thing that threw me off whenever I used to think about this topic, is: if you define some operator @ such that "x@x@x@...@x" with y x's is x+y, then how do you represent "x+1" in terms of @? Or "x+0"?

For multiplication, exponentiation, tetration... and all those other operators which can be seen as repeated applications of the previous operator, the identity is 1 (x*1 = x^1 = x^^1 = ... = x), and 0 gives a constant (x*0 = 0, x^0 = x^^0 = ... = 1), but with addition, 0 is the identity, and no value gives a constant result.
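phlip's observation about identities can be checked by actually building each operator as y-fold repetition of the previous one. `repeat`, `mul`, and `power` are sketch names of mine, naturals only:

```python
from functools import reduce

# x (op) x (op) ... with y copies of x, folded onto a base value;
# y = 0 gives back the base (empty fold).
def repeat(op, x, y, base):
    return reduce(op, [x] * y, base)

add = lambda a, b: a + b
mul = lambda x, y: repeat(add, x, y, 0)    # x*y: y copies of x added to 0
power = lambda x, y: repeat(mul, x, y, 1)  # x^y: y copies of x multiplied into 1

x = 5
print(mul(x, 1), power(x, 1))  # 5 5  (1 is the identity: x*1 = x^1 = x)
print(mul(x, 0), power(x, 0))  # 0 1  (0 gives a constant)
```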


Code: Select all

`enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};`

`void ┻━┻︵╰(ಠ_ಠ ⚠) {exit((int)⚠);}`

### Re: Something I've been wondering since 7th grade

Pathway wrote:There's a case to be made that multiplication is not repeated addition. (http://www.maa.org/devlin/devlin_06_08.html). I'm honestly not sure what to make of it yet, but the point he makes is that just because you come up with the same result multiplying and adding doesn't mean that you did the same thing to get there. That would explain why exactly your intuition that there should be an operation "@" that gives us x@x@...@x (y times) = x+y breaks down.

The Peano definition of multiplication is repeated addition. Multiplication of fractions can be done by repeated addition and division. But multiplication of fractions is a different form of multiplication, it takes different arguments than multiplication of integers.

I wonder how Dr. Devlin suggests multiplication should be introduced to young pupils. Giving them the little multiplication table and telling them to cram? (Because that doesn't seem arbitrary.) Drawing rectangles of dots? (Repeated addition, only thinly veiled.)

- gmalivuk
- GNU Terry Pratchett
**Posts:**26765**Joined:**Wed Feb 28, 2007 6:02 pm UTC**Location:**Here and There-
**Contact:**

### Re: Something I've been wondering since 7th grade

Yeah, addition is in some sense more basic, and then the others are extensions derived by repeating the previous ones.

### Re: Something I've been wondering since 7th grade

I've wondered about this topic as well, but perhaps from the other end. I've always assumed that addition was the most basic, and multiplication and exponentiation were built on that. But why are addition and multiplication commutative while exponentiation is not?

A gentle answer turns away wrath, but a harsh word stirs up anger.

### Re: Something I've been wondering since 7th grade

You have to prove that addition and multiplication are commutative and associative and then you'll see that they're really not simple things.

I remember when I was told that multiplication was commutative, when I was 8. I found it weird... and I was right, it's nothing trivial.

Rearranging 5 bags of 4 apples into 4 bags of 5 apples is a lot of work, and so is the proof of commutativity of *.

Suppose you define x $ y with 0 $ y = f(y) and (s x) $ y = g(y,x$y). (where s is successor)

In order to prove commutativity you need to show that in those 2 rules you can switch the arguments of $ :

Proving x $ 0 = f(x) (by induction on x) requires proving g(0,f(x)) = f(s x)

Proving x $ (s y) = g(x,x$y) (also by induction on x) requires proving

f(s y) = g(0,f(y)) (same as just above) and

g(s y, g(x, z)) = g(s x, g(y, z))

for +, f(x) = x and g(y,z) = s z, they come down to s x = s x and s s z = s s z.

for *, f(x) = 0 and g(y,z) = y+z, they come down to 0 = 0+0 and (s y)+(x+z) = (s x)+(y+z) which is a bit harder but doable.

If you want to try commutativity of ^, you can't prove either of those two things:

we have f(x) = 1 and g(y,z) = y*z, so they come down to 1 = 0*1 and (s y)*(x*z) = (s x)*(y*z) (and they're both false in general).

picking f(x) = 0 and g(y,z) = y*z works, but the function obtained is not really interesting...

Assuming g is commutative and associative (which is not the case for + btw), the 2nd equation simplifies to g(g(s x,y),z) = g(g(x,s y),z).

Suppose we want g(s x,y) = g(x,s y) (g would be weird otherwise).

Then, by repeated application of this rule, you must have g(x , y) = g(0 , x+y) = g(x+y , 0) : g can only depend on the sum of its arguments.

So let's replace g(x,y) with g(x+y).

The associativity of g says that g(x+g(y+z)) = g(y+g(z+x)) :

Then, g(x+g(y)) = g(g(x+y)), so g almost "commutes" with addition

If x+g(y) = g(x+y), then g(x) can only be x+g(0).

Now the 1st equation says f(s x) = g(f(x)) :

for every x, f(x) = g applied x times to f(0).

So f only depends on g and the image of 0.

Eventually you can show that x$y = g applied (x+y) times to (x*y + f(0)), which is trivially commutative under this form.

And I'm not looking into associativity...
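The recursion scheme at the top of this post transcribes directly into code (a sketch over the naturals; `make_op`, `plus`, etc. are my names). Note that with these f and g, the ^ instance computes x$y = y^x, which is enough to exhibit non-commutativity:

```python
# 0 $ y = f(y),  (s x) $ y = g(y, x $ y), with s the successor function.
def make_op(f, g):
    def op(x, y):
        return f(y) if x == 0 else g(y, op(x - 1, y))
    return op

s = lambda n: n + 1
plus = make_op(lambda x: x, lambda y, z: s(z))    # f(x)=x, g(y,z)=s z
times = make_op(lambda x: 0, lambda y, z: y + z)  # f(x)=0, g(y,z)=y+z
power = make_op(lambda x: 1, lambda y, z: y * z)  # f(x)=1, g(y,z)=y*z -> x$y = y^x

print(plus(3, 5), times(3, 5), power(3, 5))  # 8 15 125
print(power(2, 3) == power(3, 2))            # False: ^ is not commutative
```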


### Re: Something I've been wondering since 7th grade

Taking the logarithm gives you the behavior you'd like (with the usual caveat about what sort of numbers you can take the log of):

ln(xy) = ln(x) + ln(y).

It is a very useful property.
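A quick floating-point sanity check of the identity (positive arguments only):

```python
import math

# ln(xy) = ln(x) + ln(y): the log carries multiplication down to addition.
x, y = 3.7, 12.5
print(math.isclose(math.log(x * y), math.log(x) + math.log(y)))  # True
```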


### Re: Something I've been wondering since 7th grade

saus wrote:I guess it was 6.5 years ago, in a boring class in 7th grade, that I came up with this idea/question, and I've never heard of it since. My question was like this: if x*y is x plus itself y times, and x^y is x times itself y times, is there something in the other direction? Some sort of operation that, when done on x y times, gives you x+y. Anything that fits the pattern.

I've been trying to be patient and to wait until I learn enough to answer the question myself but I thought I might as well ask here.

x+y, if y=1

But seriously, there are no traditional mathematical operators that would achieve this. You do, however, have a nice set of written operators to choose from, thanks to our friends here.

Philwelch wrote:Would a prostitution enthusiast be a buy-sexual?

...sorry.

- Yakk
- Poster with most posts but no title.
**Posts:**11128**Joined:**Sat Jan 27, 2007 7:27 pm UTC**Location:**E pur si muove

### Re: Something I've been wondering since 7th grade

gmalivuk wrote:Yeah, addition is in some sense more basic, and then the others are extensions derived by repeating the previous ones.

Let succ(X) be the successor of X.

Then X+0 is defined to be X.

X + succ(Y) is defined to be succ(X+Y)

This results in ... a definition of addition that is defined in terms of repeated use of the succ() function.
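The two defining equations translate line-for-line into a recursive function (a sketch, naturals only):

```python
def succ(x):
    return x + 1

def add(x, y):
    # X + 0 = X;  X + succ(Y) = succ(X + Y)
    return x if y == 0 else succ(add(x, y - 1))

print(add(4, 3))  # 7
```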

Now, you might say "but succ is a single-argument function, while + and * are two-argument functions!" -- possibly that is because * is missing an argument!

Let's examine a three-argument multiplication. *(a,b,c) = (a)+(a+c)+(a+c+c)+(a+c+c+c)...+(a+c+...+c), b times repeated.

Note that *(a,0,c) = 0

Note that *(a,b,0) = a+a+a...+a "b times" (standard multiplication)

Note that *(a,b,c) = *(a,b,0) + *(0,b,c)

*(0,b,c) = *(c, *(0,b,1), 0) for b>=1

*(a,b,c) = *(a + *(c,b-1,0), b, -c)

Now let's go to the next level. ^(a,b,c) = a*(a+c)*(a+c+c)*...*(a+c+...+c) b times over.

^(a,a,-1) = a! = ^(1,a,1)

And the !! operator also falls out -- a!! = ^(a, a/2 round up, -2) = ^(2 - a%2, a/2 round up, 2).

All of this is almost certainly just junk, but it is fun to play with. Ie: what if it isn't that succ() has too few arguments, but rather than * and ^ etc have too few arguments?
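These three-argument operators are easy to play with if we read *(a,b,c) as a sum of b terms starting at a and stepping by c, and ^(a,b,c) as the matching product (`mul3`/`pow3` are my names for a quick sketch of the idea, not anything standard):

```python
from math import factorial, prod

# *(a,b,c) = a + (a+c) + (a+2c) + ... + (a+(b-1)c)   (b terms)
def mul3(a, b, c):
    return sum(a + k * c for k in range(b))

# ^(a,b,c) = a * (a+c) * (a+2c) * ... * (a+(b-1)c)   (b factors)
def pow3(a, b, c):
    return prod(a + k * c for k in range(b))

a = 6
print(mul3(a, 7, 0) == a * 7)            # True: ordinary multiplication
print(pow3(a, a, -1) == factorial(a))    # True: ^(a,a,-1) = a!
print(pow3(1, a, 1) == factorial(a))     # True: ^(1,a,1) = a!
```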

One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision - BR

Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.


- Torn Apart By Dingos
**Posts:**817**Joined:**Thu Aug 03, 2006 2:27 am UTC

### Re: Something I've been wondering since 7th grade

The interesting question here, in my opinion, is how x$y should be defined so that x+y=x$x$...$x (repeated y times), and $ gets some of the nice properties that addition, multiplication and exponentiation have (associativity, commutativity, continuity?).

- phlip

### Re: Something I've been wondering since 7th grade

Torn Apart By Dingos wrote:[...] exponentiation has [...] associativity, commutativity [...]

What type of crazy exponentiation are you using?


### Re: Something I've been wondering since 7th grade

division is the most basic, multiplication can just be thought of as division backwards

- Torn Apart By Dingos

### Re: Something I've been wondering since 7th grade

phlip wrote:Torn Apart By Dingos wrote:[...] exponentiation has [...] associativity, commutativity [...]

What type of crazy exponentiation are you using?

Way to butcher that quote, you might have a future in politics. It also isn't continuous. I said "some of the nice properties", but I agree it was badly phrased.

- phlip

### Re: Something I've been wondering since 7th grade

Hmm? I didn't think it was out of context at all... you said that +, * and ^ are associative, commutative and continuous... which is only true for two of the three.

Anyways, enough tangent, more successor function.

I don't think we can define an x$y that has any of the nice properties you listed, for the main reason that you can't make the property x+y=x$x$..$x (with y repetitions) make sense when y=1.



### Re: Something I've been wondering since 7th grade

We can extend the problem. Essentially we have a sequence of functions identified by positive integers.

f(1)(x, y) == x+y

f(2)(x, y) == x*y

f(3)(x, y) == x^y

...

You want to run that backwards, while trying to keep as many basic properties of the sequence as you can. But there's no need to stop there. Might as well look at what f(1/2) is, or f(pi), or f(1+i), or f(1+ε) [dual numbers, ε*ε = 0], or ...

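The integer rungs of this ladder can be sketched by folding each operator over copies of x (naturals, y ≥ 1; the fractional and complex indices are exactly the open part, and nothing here touches them):

```python
# f(1) = +, and f(n) is y-fold repetition of f(n-1): f(2) = *, f(3) = ^, ...
def f(n):
    if n == 1:
        return lambda x, y: x + y
    prev = f(n - 1)
    def op(x, y):
        acc = x
        for _ in range(y - 1):  # fold y copies of x under f(n-1)
            acc = prev(acc, x)
        return acc
    return op

print(f(1)(2, 3), f(2)(2, 3), f(3)(2, 3))  # 5 6 8
```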

Don't pay attention to this signature, it's contradictory.

- Torn Apart By Dingos

### Re: Something I've been wondering since 7th grade

phlip wrote:I don't think we can define a x$y that has any of the nice properties you listed, for the main reason that you can't make the property x+y=x$x$..$x (with y repetitions) make sense when y=1.

Right. y+1 repetitions then. So x=x+0, x$x=x+1, x$x$x=x+2, and so on.

EDIT:

Let x$x:=x+1. Then addition is repetition of this operator in the sense that x+y=x$x$...$x (with y $'s) when y=1,2,3,....

$ can't have either a left or right identity, because e$e!=e for all e.

EDIT 2: Deleted stupid stuff. Man, I suck.

Last edited by Torn Apart By Dingos on Thu Aug 14, 2008 10:37 am UTC, edited 5 times in total.

- phlip

### Re: Something I've been wondering since 7th grade

I don't think associativity would be possible:

x+3 = x$x$x$x = (x$x)$(x$x) = (x+1)$(x+1) = (x+1)+1 = x+2

(Or, if we don't use Torn's adaption, we can have:

x+6 = x$x$x$x$x$x = (x$x)$(x$x)$(x$x) = (x+2)$(x+2)$(x+2) = (x+2)+3 = x+5

which also doesn't work.)

But x$y = max(x,y)+1 satisfies commutativity... and satisfies x+y=x$x$..$x (with y dollar-signs). It's also continuous... though not differentiable at x=y (C^∞ elsewhere).

If we want commutativity (but not associativity), then we need that x$y = S(max(x,y)) (where S is the successor function) at least for when x and y are natural numbers... simply because (x+(y-1))$x has to equal x+y. There are probably interpolations other than max(x,y)+1 that satisfy this, though, maybe even some differentiable-everywhere ones...

[edit]

x$y := (x+y+2)/2 doesn't work:

x$x$x = ((x+x+2)/2)$x = (x+1)$x = ((x+1)+x+2)/2 = x+1.5 ≠ x+2

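Both candidates above are easy to test numerically (`dollar` and `half` are stand-in names for the two definitions):

```python
from functools import reduce

# phlip's candidate: x $ y = max(x, y) + 1.  Folding x $ x $ ... $ x with
# y dollar-signs (y+1 copies of x) should give x + y.
def dollar(x, y):
    return max(x, y) + 1

x, y = 4, 7
print(reduce(dollar, [x] * (y + 1)))  # 11, i.e. x + y
print(dollar(3, 5) == dollar(5, 3))   # True: commutative

# ...while the rejected candidate (x+y+2)/2 drifts off target:
half = lambda a, b: (a + b + 2) / 2
print(reduce(half, [x] * 3))          # 5.5, not x + 2 = 6
```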

