Problem promoted

At the end of my last post I posed a problem to which I don’t know the solution, and which I am very curious to see our readers tackle. It occurs to me that, if I want people to think about a problem, I shouldn’t put it at the end of a long post, below the fold. So:

Problem: Let f(x,y) and g(x,y) be polynomials in two variables such that \lim_{t \to \infty} f(t,t^{-1}) and \lim_{t \to \infty} g(t,t^{-1}) exist. Show that the determinant

(\partial f/\partial x) (\partial g/\partial y) - ( \partial f/\partial y) (\partial g/\partial x )

is not a nonzero constant. (Equivalently, show that it has a zero for some (x,y) \in \mathbb{C}^2.)

One warning: the zero is not always on the hyperbola \{ (t,t^{-1}) \}. For example, if (f,g) = (y, xy-1) then the determinant is

0 \cdot x - y \cdot 1=-y,

which does not vanish on the hyperbola.
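
For concreteness, here is a minimal sympy sketch checking this example (a sanity check, not part of the argument):

    # Check the example (f, g) = (y, xy - 1).
    from sympy import symbols, limit, oo, expand

    x, y, t = symbols('x y t')
    f, g = y, x*y - 1

    # Both limits along (t, 1/t) exist (each is 0), so the hypothesis holds.
    print(limit(f.subs({x: t, y: 1/t}), t, oo))  # 0
    print(limit(g.subs({x: t, y: 1/t}), t, oo))  # 0

    # The Jacobian determinant is -y: it vanishes on the line y = 0,
    # which misses the hyperbola xy = 1.
    D = expand(f.diff(x)*g.diff(y) - f.diff(y)*g.diff(x))
    print(D)  # -y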

9 thoughts on “Problem promoted”

  1. David, I didn’t want you thinking that nobody took an interest in your problem. In fact, I’ve spent way too much time thinking about it.

    My line of attack is to write f(x,y) = f_0(y) + f_1(y)x + … + f_m(y)x^m and similarly g(x,y) = g_0(y) + g_1(y)x + … + g_n(y)x^n. Using the limit condition, we know y | f_m(y) and y | g_n(y). We may as well assume n is minimal, and that both f_m and g_n are monic.

    Then, using the Jacobian condition, we get m f_m g_n' - n g_n f_m' = 0 (spelled out in the sketch after this comment). Simplifying and using the argument principle, we arrive at f_m = c (g_n)^{m/n}, and since these are both monic we get c = 1. Now, we can replace g by g - f if we like, so we see that n < m in any case. At this point we must have y^2 | y^{m/n} | f_m. Thus y | f_{m-1}, by the limit condition.


    Anyway, I think I eventually dealt with the n=1 case, but it got too complicated after that.
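
    Here is one way to see that relation, spelled out (assuming m + n \geq 2): the top terms in x of the partials are f_x = m f_m(y) x^{m-1} + …, g_y = g_n'(y) x^n + …, f_y = f_m'(y) x^m + …, and g_x = n g_n(y) x^{n-1} + …, so the coefficient of x^{m+n-1} in f_x g_y - f_y g_x is

    m f_m(y) g_n'(y) - n g_n(y) f_m'(y).

    If the Jacobian is constant and m + n - 1 \geq 1, this coefficient must vanish, which is the relation used above.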

  2. Is the limiting condition on f(x,y) equivalent to the naive condition that the Laurent expansion of f(t,t^{-1}) have no positive powers of t? I can prove that if f and g satisfy this condition, and you write the determinant as

    D(x,y) = \sum_{i,j \geq 0} c_{ij} x^i y^j,

    then

    \sum_{i \geq 0} c_{ii}/(i+1) = 0,

    from which I infer that if D is constant then D is 0.
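
    For a quick sanity check of the claimed identity (an example of my own, chosen so the diagonal coefficients are not all zero): take f = x - x^2 y and g = y. Then f(t,t^{-1}) = t - t = 0 and g(t,t^{-1}) = t^{-1}, so both Laurent expansions have no positive powers of t. The determinant is

    D(x,y) = (1 - 2xy) \cdot 1 - (-x^2) \cdot 0 = 1 - 2xy,

    and the diagonal sum is c_{00}/1 + c_{11}/2 = 1 - 2/2 = 0, as predicted.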

  3. At the moment, all I have is a bare-handed proof: write out everything in terms of the coefficients of f and g, and observe the cancellation. I have little idea “why” the sum should be 0 (my computer came up with the idea).

    I’m now trying to make it conceptual. In my fantasy world, the sum is a kind of “trace”, and the hypotheses on f and g give operator relations between f_y g_x and f_x g_y that force them to have the same “trace.” If I get anything out of this, I’ll post it.

    Go Hawks.
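
    In the spirit of that computer experiment, here is a minimal sympy sketch of the check (the helper diagonal_sum and the sample pair are illustrative choices, not from the comment):

        # Compute sum_i c_ii/(i+1) for the Jacobian of given f and g.
        from sympy import symbols, expand, Poly

        x, y = symbols('x y')

        def diagonal_sum(f, g):
            # D = f_x g_y - f_y g_x, written as sum c_ij x^i y^j;
            # return the sum of c_ii/(i+1) over the diagonal terms.
            D = expand(f.diff(x)*g.diff(y) - f.diff(y)*g.diff(x))
            return sum(c / (i + 1)
                       for (i, j), c in Poly(D, x, y).terms() if i == j)

        # Sample pair with f(t,1/t) = 0 and g(t,1/t) = 1/t:
        print(diagonal_sum(x - x**2*y, y))  # 0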

  4. If I’m working it out right, your proof works for a considerably larger class of examples. One specific case I like: write the Laurent expansions of f(t,t^{-1}) and g(t^{-1},t). If the only power of t appearing in both of their supports is t^0, then the Jacobian cannot be a nonzero constant.
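
    As an illustration of why some hypothesis of this kind is needed: for (f,g) = (x,y) the Jacobian is the nonzero constant 1, and correspondingly t^1 lies in the support of both f(t,t^{-1}) = t and g(t^{-1},t) = t, so the condition fails, as it must.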

  5. Sounds like you guys have some interesting ideas. Care to write up an exposition for the rest of us? (Even if it is not in its best form yet.)
