### A problem with polynomial functions

Here is a problem that popped up in my head about a year ago.

Problem 1. A function $f: \mathbb{Z} \times \mathbb{Z} \to \mathbb{Z}$ is polynomial in each argument, i.e. for every $x_0 \in \mathbb{Z}$ the function $y \mapsto f(x_0, y)$ can be represented by a polynomial with integer coefficients, and the same goes for $x \mapsto f(x, y_0)$ for every $y_0 \in \mathbb{Z}$. Is it true then that $f$ must be polynomial in both arguments simultaneously, i.e. representable by an element of $\mathbb{Z}[x,y]$?

If you’ve ever been a freshman in an analysis course, you are bound to be aware of a common trap with the notion of differentiability. The trap goes like this: you are asked to prove that a function $g: \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ is differentiable everywhere. You have several months of single-variable analysis under your belt, so your first instinct may be to prove that $\partial g / \partial x$ and $\partial g / \partial y$ exist at every point and be done with it. But the existence of partial derivatives is not enough, as demonstrated by

$\displaystyle g(x,y) = \left\{ \begin{array}{ll}0 & \quad \text{at the point } (0,0), \\ \frac{xy}{\sqrt{x^2+y^2}} & \quad \text{everywhere else.}\end{array}\right.$

Both partial derivatives of this $g$ exist everywhere (at the origin they are zero), yet $g$ is not differentiable at $(0,0)$: along the diagonal it reduces to $g(x,x) = |x|/\sqrt{2}$, which has no derivative at $0$.
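
A quick numerical sketch in Python (my own code, not from the original argument) makes the trap visible: both partial difference quotients at the origin vanish, but the difference quotient along the diagonal does not.

```python
import math

def g(x, y):
    # the counterexample above: 0 at the origin, xy/sqrt(x^2+y^2) elsewhere
    if x == 0 and y == 0:
        return 0.0
    return x * y / math.sqrt(x * x + y * y)

h = 1e-8

# both partial derivatives at (0,0) exist and equal 0:
print((g(h, 0) - g(0, 0)) / h)    # 0.0
print((g(0, h) - g(0, 0)) / h)    # 0.0

# but along the diagonal y = x the difference quotient is sign(h)/sqrt(2),
# so no linear map can approximate g at the origin:
print(g(h, h) / h)        # about  0.7071
print(g(-h, -h) / (-h))   # about -0.7071
```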

Problem 1 asks whether polynomiality (not a word?) has the same trap as differentiability. When I came up with Problem 1 I was busy with more “serious” math, so the question didn’t get any real thought. I always suspected that it wouldn’t take long to solve, but somehow I was too lazy to do the job. That is, until this morning, when I finally put in the required several minutes and got this thing sorted out. If you’re curious, you might want to do the same before moving on to the next paragraph.

Without further ado, the answer to Problem 1 is no. A function can be polynomial in each argument, but not polynomial overall. Here is an example:

$\displaystyle f(x,y) = \sum _{n=0}^\infty \prod _{k=0}^ n (x^2-k^2)(y^2-k^2).$

The sum is actually finite at every point $(x,y)$: every term with $n \geqslant \min (|x|,|y|)$ contains a zero factor. Also, if we fix $x_0 \in \mathbb{Z}$, then $f(x_0,y)$ becomes a polynomial in $y$: the surviving terms have $n \leqslant |x_0|-1$, so the degree is at most $2|x_0|$.
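
Here is a small Python sketch of this $f$ (my own code, exploiting the fact that only the terms with $n < \min(|x|,|y|)$ survive). For small fixed $x_0$ one can check the explicit polynomials $f(0,y)=0$, $f(1,y)=y^2$ and $f(2,y)=12y^4-8y^2$ directly.

```python
def f(x, y):
    # f(x, y) = sum_{n>=0} prod_{k=0}^{n} (x^2 - k^2)(y^2 - k^2);
    # every term with n >= min(|x|, |y|) contains a zero factor,
    # so only finitely many terms contribute at an integer point
    total, term = 0, 1
    for n in range(min(abs(x), abs(y))):
        term *= (x * x - n * n) * (y * y - n * n)
        total += term
    return total

# fixing the first argument gives a genuine polynomial in y:
assert all(f(0, y) == 0 for y in range(-6, 7))
assert all(f(1, y) == y ** 2 for y in range(-6, 7))
assert all(f(2, y) == 12 * y ** 4 - 8 * y ** 2 for y in range(-6, 7))
# and f is symmetric in x and y, so the same holds with the roles swapped
assert f(3, 5) == f(5, 3)
```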

It remains to prove that $f(x,y)$ is not polynomial itself. But if it were, then $g(x)=f(x,x)$ would be polynomial too. And $g(x)$ clearly grows faster than any polynomial: at $y=x$ every term of the sum is a perfect square, so $g(n)$ is at least its term with index $n-1$, which equals $\prod_{k=0}^{n-1}(n^2-k^2)^2 \geqslant (n!)^2$. So we are done.
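
To see the superpolynomial growth concretely, here is a sketch of $g(x) = f(x,x)$ (again my own code): $g(n)$ already beats $(n!)^2$.

```python
from math import factorial

def g(x):
    # g(x) = f(x, x) = sum_{n < |x|} prod_{k=0}^{n} (x^2 - k^2)^2
    total, term = 0, 1
    for n in range(abs(x)):
        term *= (x * x - n * n) ** 2
        total += term
    return total

# every term is a square, so g(n) is at least its last term,
# which already dominates (n!)^2 -- no polynomial keeps up
for n in range(2, 8):
    print(n, g(n), factorial(n) ** 2)
assert all(g(n) > factorial(n) ** 2 for n in range(2, 20))
```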

Now, what if we replace $\mathbb{Z}$ with another ring? Something larger, maybe, like all rational numbers or all real numbers. Will the trick still work? In the case of the rationals it will, with minor modifications.

Exercise. Problem 1 still has a negative answer when $\mathbb {Z}$ is replaced with $\mathbb {Q}$.

But if we move on from $\mathbb {Z}$ to the uncountable $\mathbb {R}$, the trick stops working. In fact, the problem changes its answer.

Fact of life. If a function $f:\, \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ is polynomial in each argument, then it is polynomial.

Proof. Let’s introduce some new notation for convenience. For every $x \in \mathbb {R}$, define $l_ x:\, \mathbb {R} \to \mathbb {R}$ by $l_ x(y) = f(x,y)$. Similarly, for every $y \in \mathbb {R}$ let $r_ y:\, \mathbb {R} \to \mathbb {R}$, $r_ y(x) = f(x,y)$. We know that all the functions $l_ x$ and $r_ y$ are polynomial.

First, we prove that there is a natural number $n$ and an infinite set $X \subset \mathbb{R}$ such that $\deg l_x \leqslant n$ for every $x \in X$. To do this, for every $k\in \mathbb{N}$ define $X_k = \{ x \in \mathbb{R} \,|\, \deg l_x \leqslant k\}$. Clearly, $\mathbb{R} = \bigcup_{k \in \mathbb{N}} X_k$. Since $\mathbb{R}$ is uncountable and $\mathbb{N}$ is countable, some $X_k$ must be infinite, which gives us the desired $n$ and $X$.

OK, so for every $x \in X$ we know that $l_x$ is a polynomial function of degree at most $n$. This means that there are functions $a_0, a_1, \ldots, a_n: X \to \mathbb{R}$ such that

$\displaystyle l_ x(y) = f(x,y) = a_0(x) + a_1(x)y + \ldots + a_ n(x)y^ n$

for all $x \in X$, $y \in \mathbb {R}$.

The next order of business is to prove that each $a_i$ is representable by a polynomial, i.e. that there is a $b_i \in \mathbb{R}[x]$ such that $a_i(x) = b_i(x)$ for all $x \in X$. This is immediately obvious for $a_0$, because $a_0 = r_0|_X$. It is only a bit more difficult for an arbitrary $a_i$. It is an easy exercise to show that the coefficients of a polynomial of degree at most $n$ can be recovered as linear combinations of its values at $n+1$ distinct points. This means that for every $i=\overline{0,n}$ there exist constants $c_{i,0}, c_{i,1}, \ldots, c_{i,n} \in \mathbb{R}$ such that

$\displaystyle a_i(x) = \sum_{j=0}^n c_{i,j} f(x,j)$

for all $x \in X$, and thus $a_i$ is a linear combination of the functions $r_j|_X$. Therefore, $a_i$ is polynomial.
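
The “easy exercise” is just Lagrange interpolation: expanding a degree-$\leqslant n$ polynomial in the Lagrange basis at the nodes $0, 1, \ldots, n$ expresses each coefficient as a fixed linear combination of the values at those nodes, with weights that do not depend on the polynomial. A sketch in Python with exact rational arithmetic (all names are mine):

```python
from fractions import Fraction

def poly_mul(p, q):
    # multiply two coefficient lists (lowest degree first)
    r = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] += a * b
    return r

def lagrange_basis(nodes, j):
    # coefficients of L_j(y) = prod_{m != j} (y - nodes[m]) / (nodes[j] - nodes[m])
    p = [Fraction(1)]
    for m, ym in enumerate(nodes):
        if m != j:
            p = [c / (nodes[j] - ym) for c in poly_mul(p, [-ym, Fraction(1)])]
    return p

def coeffs_from_values(values):
    # recover a_0, ..., a_n of a degree-<=n polynomial from its values
    # at 0, 1, ..., n; the weights are constants independent of the
    # polynomial -- exactly the constants in the linear combination above
    n = len(values) - 1
    nodes = list(range(n + 1))
    coeffs = [Fraction(0)] * (n + 1)
    for j, v in enumerate(values):
        for i, c in enumerate(lagrange_basis(nodes, j)):
            coeffs[i] += v * c
    return coeffs

# p(y) = 2 - 3y + y^3, evaluated at y = 0, 1, 2, 3:
assert coeffs_from_values([2, 0, 4, 20]) == [2, -3, 0, 1]
```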

And we are almost done. We have polynomials $b_i \in \mathbb{R}[x]$, $i = \overline{0,n}$, that agree with the $a_i$ on the set $X$. Now define a polynomial function $g$ on all of $\mathbb{R} \times \mathbb{R}$ like this:

$\displaystyle g(x,y) = b_0(x) + b_1(x)y + \ldots + b_ n(x)y^ n.$

It is obvious that $g|_{X \times \mathbb{R}} = f|_{X \times \mathbb{R}}$. If we fix an arbitrary $y_0 \in \mathbb{R}$, we see that the functions $x \mapsto g(x,y_0)$ and $x \mapsto f(x,y_0)$ are both polynomial and have the same restriction to $X$. Since $X$ is infinite, these two functions coincide everywhere, i.e. $g(x,y_0)=f(x,y_0)$ for every $x \in \mathbb{R}$. But $y_0$ was arbitrary, so $g$ and $f$ are the same function, and the proof is complete.