## Problem promoted

*December 5, 2008*

*Posted by David Speyer in Uncategorized.*

At the end of my last post I posed a problem to which I don’t know the solution, and which I am very curious to see our readers tackle. It occurs to me that, if I want people to think about a problem, I shouldn’t put it at the end of a long post, below the fold. So:

**Problem.** Let $f$ and $g$ be polynomials in two variables such that $\lim_{t \to \infty} f(t, t^{-1})$ and $\lim_{t \to \infty} g(t, t^{-1})$ exist. Show that the determinant

$$\frac{\partial f}{\partial x} \frac{\partial g}{\partial y} - \frac{\partial f}{\partial y} \frac{\partial g}{\partial x}$$

is not a nonzero constant. (Equivalently, show that it has a zero for some $(x, y)$.)

One warning: the zero is not always on the hyperbola $xy = 1$. For example, if $f = x(1 - xy)$ and $g = y$, then the determinant is

$$1 - 2xy,$$

which does not vanish on the hyperbola.
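For readers who want to experiment, checks like this are easy to automate. Below is a minimal sketch using sympy; the pair $f = x(1 - xy)$, $g = y$ is one illustrative choice satisfying the limit condition, since $f(t, t^{-1}) = 0$ and $g(t, t^{-1}) = t^{-1}$.

```python
import sympy as sp

x, y = sp.symbols('x y')

# An illustrative pair satisfying the limit condition:
# f(t, 1/t) = t - t = 0 and g(t, 1/t) = 1/t both have limits as t -> oo.
f = x * (1 - x * y)
g = y

# The Jacobian determinant f_x g_y - f_y g_x.
J = sp.expand(sp.diff(f, x) * sp.diff(g, y) - sp.diff(f, y) * sp.diff(g, x))
print(J)  # equals 1 - 2*x*y: not constant, yet nonzero on the hyperbola

# On xy = 1 the determinant is -1; its zero locus is the curve xy = 1/2.
print(J.subs(x * y, 1))        # -1
print(J.subs(y, 1 / (2 * x)))  # 0
```

So the determinant does vanish somewhere, as the problem predicts, just not on $xy = 1$.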

## Comments


David, I didn’t want you thinking that nobody took an interest in your problem. In fact, I’ve spent way too much time thinking about it.

My line of attack is to write f(x,y) = f_0(y) + f_1(y)x + … + f_m(y)x^m and similarly g(x,y) = g_0(y) + g_1(y)x + … + g_n(y)x^n. Using the limit condition, we know y | f_m(y) and y | g_n(y). We may as well assume n is minimal and that both f_m and g_n are monic.

Then, using the Jacobian condition, we get m*f_m*g_n' - n*g_n*f_m' = 0. Simplifying and using the argument principle, we arrive at f_m = c(g_n)^{m/n}, and since these are both monic we get c = 1. Now, we can replace g by g - f if we like, so we see that n < m in any case. At this point we must have y^2 | y^{m/n} | f_m. Thus y | f_{m-1}, by the limit condition.
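The first step of this attack is easy to sanity-check symbolically: m*f_m*g_n' - n*g_n*f_m' is exactly the coefficient of x^(m+n-1) in the Jacobian determinant, which must vanish if the Jacobian is constant. A sketch with sympy, using the illustrative case m = 2, n = 1 with generic coefficient functions (the names f0, …, g1 are mine):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Generic coefficient functions of y for the case m = 2, n = 1.
f0, f1, f2 = [sp.Function(name)(y) for name in ('f0', 'f1', 'f2')]
g0, g1 = [sp.Function(name)(y) for name in ('g0', 'g1')]

f = f2 * x**2 + f1 * x + f0
g = g1 * x + g0

# Jacobian determinant and its coefficient of x^(m+n-1) = x^2.
J = sp.expand(sp.diff(f, x) * sp.diff(g, y) - sp.diff(f, y) * sp.diff(g, x))
top = J.coeff(x, 2)

# The commenter's expression m*f_m*g_n' - n*g_n*f_m' matches that coefficient.
claim = 2 * f2 * sp.diff(g1, y) - 1 * g1 * sp.diff(f2, y)
assert sp.simplify(top - claim) == 0
```

If the Jacobian is a constant, every coefficient of a positive power of x vanishes, and the top one gives the relation above.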

etc…

Anyway, I think I eventually dealt with the n=1 case, but it got too complicated after that.

Is the limiting condition on f(x,y) equivalent to the naive condition that the Laurent expansion of f(t, t^{-1}) have no positive powers of t? I can prove that if f and g satisfy this condition, and you write the determinant as

D(x,y) = sum_{i,j >= 0} c_ij x^i y^j

then

sum_{i >= 0} c_ii/(i+1) = 0,

from which I infer that if D is constant then D is 0.
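Dennis's identity can be tested on concrete pairs. Here is a sketch with sympy, using a pair of my own choosing that satisfies the no-positive-powers condition (f(t, 1/t) = 0 and g(t, 1/t) = 2/t):

```python
import sympy as sp

x, y = sp.symbols('x y')

# A pair satisfying the Laurent condition: f(t,1/t) = 0, g(t,1/t) = 2/t.
f = x * (1 - x * y)
g = y + x * y**2

# The determinant D(x,y); for this pair it works out to 1 - 3*x**2*y**2.
D = sp.expand(sp.diff(f, x) * sp.diff(g, y) - sp.diff(f, y) * sp.diff(g, x))

# Dennis's identity: the weighted diagonal sum  sum_i c_ii/(i+1)  vanishes,
# where c_ij is the coefficient of x^i y^j in D.
P = sp.Poly(D, x, y)
s = sum(sp.Rational(1, i + 1) * P.coeff_monomial(x**i * y**i)
        for i in range(P.total_degree() + 1))
print(s)  # 0
```

Here the diagonal coefficients are c_00 = 1 and c_22 = -3, so the sum is 1/1 + (-3)/3 = 0, as the identity predicts.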

Dennis, that is how I understood the limiting condition. I’d be interested to see your proof.

Okay, I think I worked it out. That is pretty clever!

At the moment, all I have is a bare-handed proof— write out everything in terms of the coefficients of f and g, and observe the cancellation. I have little idea “why” the sum should be 0 (my computer came up with the idea).

I’m now trying to make it conceptual. In my fantasy world, the sum is a kind of “trace”, and the hypotheses on f and g give operatory relations between f_y g_x and f_x g_y that force them to have the same “trace.” If I get anything out of this, I’ll post it.

Go Hawks.

If I’m working it out right, your proof works for a much larger class of examples. One specific case I like: write the Laurent expansions of f(t, t^{-1}) and g(t^{-1}, t). If the only power of t in both of their supports is t^0, then the Jacobian cannot be a nonzero constant.

Sounds like you guys have some interesting ideas. Care to write up an exposition for the rest of us? (Even if it is not in its best form yet.)

[...] December 17, 2008 Posted by davidspeyer in Algebraic Geometry, fun problems. trackback In my previous post, I posed the [...]

[...] “in action” about a problem in algebraic geometry. He first juggles an idea , then discussion ensues , and we have a happy ending where the Jacobi Conjecture remains at large – but we understand [...]