I saw this on the AOPS forum, and so far no one has given a satisfactory answer (i.e., more than just the set of solutions). So I was wondering if the people here at xkcd could help.

[math](a-c)-(b-d) = 1[/math]

[math](a^2 - c^2)+(b^2-d^2) = 3[/math]

[math](a^3 - c^3)-(b^3 - d^3) = -5[/math]

[math](a^4 - c^4)+(b^4 - d^4) = 15[/math]

To me, it seems like there would be a really clever solution to this system of equations.

## System of Equations with 4 variables


### Re: System of Equations with 4 variables

> **kcaze wrote:** To me, it seems like there would be a really clever solution to this system of equations.

What makes you say that? I mean that the setup of the equations themselves looks pretty, but the answer for each doesn't seem to follow an order...

### Re: System of Equations with 4 variables

> **masher wrote:** What makes you say that? I mean that the setup of the equations themselves looks pretty, but the answer for each doesn't seem to follow an order...

To be honest, nothing really besides the fact that the equations look really simple. I've never dealt with anything beyond simultaneous equations of order 2 in school so I don't really know how to approach this.

### Re: System of Equations with 4 variables

By "clever solution" do you mean that the values of a through d are cool/weird/nice, or that the method of solving the equations is clever?

Because the former would be cool, whereas the latter is just an application of linear algebra...


### Re: System of Equations with 4 variables

I was referring to the method actually. Could you show me how you would solve this using linear algebra?

### Re: System of Equations with 4 variables

The equations are nonlinear in the variables, so you cannot use linear algebra to solve this system.

Problems of this sort are really just root-finding problems. In other words, if you subtract the right-hand side from both sides of each equation, what you get is a system such as

[math]\begin{array}{ccc}

f_1(\mathbf{x}) & = & 0 \\

f_2(\mathbf{x}) & = & 0 \\

f_3(\mathbf{x}) & = & 0 \\

f_4(\mathbf{x}) & = & 0

\end{array}[/math]

or, in vector notation

[math]\mathbf{f}(\mathbf{x}) = \mathbf{0}[/math]

If [imath]\mathbf{f}(\mathbf{x})[/imath] is linear, then you can do various things to solve it, or, at least, the conditions under which a solution exists or is unique are very straightforward. Here, however, you have a nonlinear system, so linear algebra alone won't solve it. Some nonlinear systems can be solved analytically, and various tools exist for that, but they are usually ad hoc and apply only to specific families of problems. I haven't done any searching or tried to break this problem down, but I don't know offhand of any such solution for this one.

The reason people only posted the family of solutions is that problems like this are typically solved numerically. Since this is just a root-finding problem, it is easy to apply a root-finding algorithm to it. Newton's method, a fairly straightforward iterative method, can be generalized to n dimensions by using the Jacobian of [imath]\mathbf{f}(\mathbf{x})[/imath]. The Wikipedia page has a comprehensive review of the topic: http://en.wikipedia.org/wiki/Newton%27s ... _equations
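
In code, the "subtract the right-hand side" step looks like this. A minimal Python sketch (the function name is mine); the quadruple (a, b, c, d) = (1, 2, -1, 1) is one solution of the system, which you can verify by substitution:

```python
def f(a, b, c, d):
    # Residuals of the four equations: all four are zero exactly at a solution.
    return (
        (a - c) - (b - d) - 1,
        (a**2 - c**2) + (b**2 - d**2) - 3,
        (a**3 - c**3) - (b**3 - d**3) + 5,
        (a**4 - c**4) + (b**4 - d**4) - 15,
    )

# (1, 2, -1, 1) zeroes all four residuals, so it solves the system.
print(f(1, 2, -1, 1))  # (0, 0, 0, 0)
```

This vector of residuals is exactly what you would hand to a numerical root finder.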

### Re: System of Equations with 4 variables

I feel like I've seen something like this in the past where there was a clever way of solving it, on one of those math competition things or something along those lines.

I don't know if it was exactly the same, and what I saw may have had specially chosen constants to make things work out nicer, but I seem to recall some nonlinear system with a bunch of differences of squares where you could do some trickery of multiplying one equation by another and discovering "oh, this contains the same terms as this other equation, which we know the value of!" and that sort of thing.

Of course, there are more general sorts of methods to solve that sort of thing, but depending on the context of where you saw it, I could see how there 'might' be a clever solution one could produce by hand within a reasonable amount of time. That said though, if it's just a general problem, there may not be anything more to it than using one of those numerical methods.

- eta oin shrdlu

### Re: System of Equations with 4 variables

I don't see a very clever solution, but you can solve this exactly just using algebra. I started by rewriting the equations:[math]\begin{eqnarray*}

a-b &= c-d+1 \\

a^2+b^2 &= c^2+d^2+3 \\

a^3-b^3 &= c^3-d^3-5 \\

a^4+b^4 &= c^4+d^4+15

\end{eqnarray*}[/math]Spoilered solution outline:

**Spoiler:**

### Re: System of Equations with 4 variables

Eta, that seems extremely tedious, but at least it doesn't use any techniques I'm unfamiliar with. In general, is it possible to solve any system (no matter what powers it contains) that has a unique solution using simple algebraic manipulations?

Gorcee, I've just learned about Newton's method in calculus, but all the information about generalizing it to n dimensions seems over my head. Are these root-finding algorithms what a calculator usually uses when you tell it to find the zero of a function? On my TI-83 Plus, you have to specify a left bound and a right bound, which I think indicates it's doing something similar to Newton's method, because you have to start relatively near the zero for it to work.

Ugh, is this what real analysis is? It seems ugly and tedious to me.

### Re: System of Equations with 4 variables

> **kcaze wrote:** Eta, that seems extremely tedious, but at least it doesn't use any techniques I'm unfamiliar with. In general, is it possible to solve any system (no matter what powers it contains) that has a unique solution using simple algebraic manipulations?

In general, no. In fact, even in one variable, polynomial equations above degree 4 are not generally solvable algebraically (i.e., there's no equivalent of the quadratic formula for degree ≥ 5). The proof of this lies in something called Galois theory, and is pretty advanced.

Also, not every system is guaranteed to have a solution, and if a solution exists, there is no guarantee that it is unique.

> **kcaze wrote:** Gorcee, I've just learned about Newton's method in calculus, but all the information about generalizing it to n dimensions seems over my head. Are these root-finding algorithms what a calculator usually uses when you tell it to find the zero of a function? On my TI-83 Plus, you have to specify a left bound and a right bound, which I think indicates it's doing something similar to Newton's method, because you have to start relatively near the zero for it to work.

Essentially, yes. Newton's method is really easy to describe. Take a function, and pick a point "near" a root, call it x_0. Find the slope of the function at that point, and draw it down to the x-axis (i.e., draw the tangent line to the function until it intersects the x-axis). Now you have a new value of x, call it x_1. Find the value of the function at x_1, find the slope, follow it down as before, giving you x_2. Continue until the difference between x_n and x_{n+1} is very small. Then your root is at approximately x_{n+1}.
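
That procedure translates almost line for line into code. A minimal sketch (the function names and the x² − 2 example are mine):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    # Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until successive iterates agree.
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: root of f(x) = x^2 - 2, starting "near" the root at x_0 = 1.
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, 1.0)
print(root)  # ~1.4142135623730951, i.e. sqrt(2)
```

Note how few iterations it needs: near a simple root, Newton's method roughly doubles the number of correct digits per step.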

In higher dimensions, you do the same thing, except the step uses the Jacobian instead of a single slope: at each iterate you solve the linear system J(x_n) Δx = -f(x_n) and set x_{n+1} = x_n + Δx. Geometrically, a function of 2 variables gives you a surface, so instead of one tangent line at a point you have an infinite set of tangent lines, which together constitute a tangent plane. Follow that down to where z = 0, and repeat.
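
For a concrete two-variable version, here is a sketch with a toy system I made up (x² + y² = 4 and xy = 1); the four-variable system above works the same way, just with a 4×4 Jacobian solve instead of a 2×2 one:

```python
def newton2d(f1, f2, jac, x, y, tol=1e-12, max_iter=50):
    # Newton's method for two equations: each step solves J * delta = -f.
    for _ in range(max_iter):
        a, b, c, d = jac(x, y)           # J = [[a, b], [c, d]]
        u, v = -f1(x, y), -f2(x, y)
        det = a * d - b * c
        dx = (u * d - b * v) / det       # Cramer's rule for the 2x2 solve
        dy = (a * v - u * c) / det
        x, y = x + dx, y + dy
        if abs(dx) + abs(dy) < tol:
            break
    return x, y

# Solve x^2 + y^2 = 4 and x*y = 1, starting "near" a root at (2, 0.5).
x, y = newton2d(
    lambda x, y: x**2 + y**2 - 4,
    lambda x, y: x * y - 1,
    lambda x, y: (2 * x, 2 * y, y, x),   # partial derivatives, row by row
    2.0, 0.5,
)
print(x**2 + y**2, x * y)  # ~4.0 and ~1.0
```

The starting point matters: a bad initial guess can send the iteration to a different root or off to nowhere, which is why the calculator asks for bounds near the zero.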

> **kcaze wrote:** Ugh, is this what real analysis is? It seems ugly and tedious to me.

No. Real analysis can be much more tedious, but much more interesting. It's kind of like this: the algebra that you're doing now is kind of like driving a car. You need to know some fundamentals on how a car works: the accelerator makes you go, the brakes make you stop, and you need to shift. As you become a more experienced driver, you begin to tell if your brakes don't feel right, or if your steering is wobbly.

As you go onto more advanced math, it's like learning more about how a car works. The brake pedal doesn't simply apply the brakes. Instead, it changes the pressure in a hydraulic system that is augmented by vacuum pressure generated by the engine. This causes the hydraulic rams in your brake calipers to press inwards and grab your rotors. It is very tedious to understand all this, sure, but can be very interesting and useful in the end. Eventually, you'll get to the point (if you desire) where you can feel your brakes and understand that they feel squishy because you're losing hydraulic pressure, and that you should check the integrity of the air hoses to the brake booster.

Real analysis is about understanding the mechanics of numbers; specifically, the reals. You are given a fundamental set of basic rules, and you re-construct how algebra, calculus, etc. work. It is pedantic, but that is because you have to be very precise with respect to definitions. It's also very tough, but very beautiful.

### Re: System of Equations with 4 variables

> **kcaze wrote:** Eta, that seems extremely tedious, but at least it doesn't use any techniques I'm unfamiliar with. In general, is it possible to solve any system (no matter what powers it contains) that has a unique solution using simple algebraic manipulations?

Generally, no. It's worth mentioning that it isn't clear the solution here is unique anyway. It looks to me like it will produce quadratic equations in a and b, so the solution will be multivalued in that sense. It's entirely possible that a different solution strategy might land in a different solution set for c and d as well. Proving uniqueness in nonlinear systems is generally very difficult except in very, very special cases. And even if a unique solution exists in principle, there is no guarantee that it can be found algebraically. Even very simple one-variable nonlinear equations such as [imath]x = e^x[/imath] cannot be solved algebraically (that one, in fact, has no real solutions at all).
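
A sign-flipped cousin of that example, x = e^{-x}, does have a unique real root (≈ 0.567, sometimes called the omega constant), but there is no elementary closed form for it; a numerical method is the only way in. A minimal bisection sketch (helper name and iteration count are mine):

```python
import math

def bisect(g, lo, hi, iters=60):
    # Bisection: g(lo) and g(hi) must have opposite signs; each
    # iteration halves the interval that brackets the root.
    for _ in range(iters):
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Root of g(x) = x - e^{-x}: g(0) = -1 < 0 and g(1) = 1 - 1/e > 0,
# so the root lies in [0, 1].
root = bisect(lambda x: x - math.exp(-x), 0.0, 1.0)
print(root)  # ~0.5671432904
```

Bisection is slower than Newton's method (one binary digit per step rather than digit-doubling) but never needs a derivative and cannot run away from a bracketed root.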

> **kcaze wrote:** Gorcee, I've just learned about Newton's method in calculus, but all the information about generalizing it to n dimensions seems over my head. Are these root-finding algorithms what a calculator usually uses when you tell it to find the zero of a function? On my TI-83 Plus, you have to specify a left bound and a right bound, which I think indicates it's doing something similar to Newton's method, because you have to start relatively near the zero for it to work.
>
> Ugh, is this what real analysis is? It seems ugly and tedious to me.

Real analysis is essentially using huge amounts of time and effort to prove results that will seem obvious to a grade schooler.
