[math-fun] Some questions from the 2018 Putnam Exam
These are the questions from the recent Putnam Exam that seemed most interesting to me. I've changed some typography, and a word or two, for clarity. (I haven't tried to solve any yet. If anyone posts a solution, maybe include a spoiler warning for that problem number?)

—Dan

*************************************************************************
----- A2 Let S_1, S_2, ..., S_(2^n−1) be the nonempty subsets of {1,2,...,n} in some order, and let M be the (2^n−1)×(2^n−1) matrix whose (i,j)th entry m(i,j) = 0 if S_i ∩ S_j is empty; m(i,j) = 1 otherwise. -----
----- A5 Let f : R → R be an infinitely differentiable function satisfying f(0) = 0, f(1) = 1, and f(x) ≥ 0 for all x ∈ R. Show that there exist a positive integer n and a real number x such that the nth derivative of f is negative when evaluated at x:
f^(n)(x) < 0. -----
----- B2 Let n be a positive integer, and let
f_n(z) = n + (n−1)z + (n−2)z^2 + ··· + z^(n−1).
Prove that f_n has no roots in the closed unit disk {z ∈ C: |z| ≤ 1}. -----
----- B4 Given a real number a, we define a sequence by x_0 = 1, x_1 = x_2 = a, and
x_(n+1) = 2 x_n x_(n−1) − x_(n−2) for n ≥ 2.
Prove that if x_n = 0 for some n, then the sequence is periodic. -----
----- B5 Let f = (f_1, f_2) be a function from R^2 to R^2 with continuous partial derivatives ∂f_i/∂x_j that are positive everywhere. Suppose that
(∂f_1/∂x_1)(∂f_2/∂x_2) - (1/4)(∂f_1/∂x_2 + ∂f_2/∂x_1)^2 > 0
everywhere. Prove that f is one-to-one. -----
*************************************************************************
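Not a solution, just a way to get a feel for A2: a short Python sketch (not part of Dan's post; it assumes the index set really is the 2^n − 1 nonempty subsets of {1, ..., n}) that builds the matrix M for a small n.

    from itertools import combinations

    n = 3
    # The 2^n - 1 nonempty subsets of {1, ..., n}, in some fixed order.
    subsets = [frozenset(c)
               for r in range(1, n + 1)
               for c in combinations(range(1, n + 1), r)]

    # m(i, j) = 0 if S_i and S_j are disjoint, 1 otherwise.
    M = [[0 if not (s & t) else 1 for t in subsets] for s in subsets]

    for row in M:
        print(row)

For n = 3 this prints a 7×7 0/1 matrix; staring at small cases like this is the obvious way to start hunting for whatever the problem goes on to ask about M.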
A2 is really easy! On Tue, Dec 4, 2018 at 5:37 PM Dan Asimov <dasimov@earthlink.net> wrote:
These are the questions from the recent Putnam Exam that seemed most interesting to me. I've changed some typography, and a word or two, for clarity. (I haven't tried to solve any yet. If anyone posts a solution, maybe include a spoiler warning for that problem number?)
—Dan
************************************************************************* ----- A2 Let S_1, S_2, ..., S_(2^n−1) be the nonempty subsets of {1,2,...,n} in some order, and let M be the (2^n−1)×(2^n−1) matrix whose (i,j)th entry m(i,j) = 0 if S_i ∩ S_j is empty; m(i,j) = 1 otherwise. -----
----- A5 Let f : R → R be an infinitely differentiable function satisfying f(0) = 0, f(1) = 1, and f(x) ≥ 0 for all x ∈ R. Show that there exist a positive integer n and a real number x such that the nth derivative of f is negative when evaluated at x:
f^(n)(x) < 0. -----
----- B2 Let n be a positive integer, and let
f_n(z) = n +(n−1)z + (n−2)z^2 +··· + z^(n−1).
Prove that f_n has no roots in the closed unit disk {z ∈ C: |z| ≤ 1}. -----
----- B4 Given a real number a, we define a sequence by x_0 = 1, x_1 = x_2 = a, and
x_(n+1) = 2 x_n x_(n−1) − x_(n−2) for n ≥ 2.
Prove that if x_n = 0 for some n, then the sequence is periodic. -----
----- B5 Let f = (f_1, f_2) be a function from R^2 to R^2 with continuous partial derivatives ∂f_i/∂x_j that are positive everywhere. Suppose that
(∂f_1/∂x_1)(∂f_2/∂x_2) - (1/4)(∂f_1/∂x_2 + ∂f_2/∂x_1)^2 > 0
everywhere. Prove that f is one-to-one. ----- *************************************************************************
-- Mike Stay - metaweta@gmail.com http://math.ucr.edu/~mike https://reperiendi.wordpress.com
(Apologies for the double-quoting; either I never got the original or it somehow got spam-binned by my mail program.)
On Tue, Dec 4, 2018 at 5:37 PM Dan Asimov <dasimov@earthlink.net> wrote:
> These are the questions from the recent Putnam Exam that seemed most interesting to me. I've changed some typography, and a word or two, for clarity. (I haven't tried to solve any yet. If anyone posts a solution, maybe include a spoiler warning for that problem number?)
> ...
> A5 Let f : R → R be an infinitely differentiable function satisfying f(0) = 0, f(1) = 1, and f(x) ≥ 0 for all x ∈ R. Show that there exist a positive integer n and a real number x such that the nth derivative of f is negative when evaluated at x:
>
> f^(n)(x) < 0.

Surely this can't be right! Why can't we take f(x) = x^2?
-- g
On Thu, Dec 13, 2018 at 8:44 PM Gareth McCaughan <gareth.mccaughan@pobox.com> wrote:
(Apologies for the double-quoting; either I never got the original or it somehow got spam-binned by my mail program.)
On Tue, Dec 4, 2018 at 5:37 PM Dan Asimov <dasimov@earthlink.net> wrote:
> These are the questions from the recent Putnam Exam that seemed most interesting to me. I've changed some typography, and a word or two, for clarity. (I haven't tried to solve any yet. If anyone posts a solution, maybe include a spoiler warning for that problem number?)
> ...
> A5 Let f : R → R be an infinitely differentiable function satisfying f(0) = 0, f(1) = 1, and f(x) ≥ 0 for all x ∈ R. Show that there exist a positive integer n and a real number x such that the nth derivative of f is negative when evaluated at x:
>
> f^(n)(x) < 0.
Surely this can't be right! Why can't we take f(x) = x^2?
n=1, x=-1. -- Mike Stay - metaweta@gmail.com http://math.ucr.edu/~mike https://reperiendi.wordpress.com
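(Spelled out: with f(x) = x^2 we have f'(x) = 2x, so f^(1)(−1) = −2 < 0. Presumably the catch is that the problem lets x range over all of R rather than just [0, 1], so the left half of the parabola already supplies a negative derivative.)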
Here's the essential idea: since f(0) = 0 and f(1)=1, if we define y = inf { x>= 0: f(x) > 0}, we have y in [0,1). Since f is infinitely differentiable, it can't hold that f(x) = 0 for all x <= y. So f must be decreasing somewhere before y, and thus its derivative is negative. On Fri, Dec 14, 2018 at 10:26 AM Mike Stay <metaweta@gmail.com> wrote:
On Thu, Dec 13, 2018 at 8:44 PM Gareth McCaughan <gareth.mccaughan@pobox.com> wrote:
(Apologies for the double-quoting; either I never got the original or it somehow got spam-binned by my mail program.)
On Tue, Dec 4, 2018 at 5:37 PM Dan Asimov <dasimov@earthlink.net> wrote:
> These are the questions from the recent Putnam Exam that seemed most interesting to me. I've changed some typography, and a word or two, for clarity. (I haven't tried to solve any yet. If anyone posts a solution, maybe include a spoiler warning for that problem number?)
> ...
> A5 Let f : R → R be an infinitely differentiable function satisfying f(0) = 0, f(1) = 1, and f(x) ≥ 0 for all x ∈ R. Show that there exist a positive integer n and a real number x such that the nth derivative of f is negative when evaluated at x:
>
> f^(n)(x) < 0.
Surely this can't be right! Why can't we take f(x) = x^2?
n=1, x=-1. -- Mike Stay - metaweta@gmail.com http://math.ucr.edu/~mike https://reperiendi.wordpress.com
On 14/12/2018 16:11, Victor Miller wrote:
Here's the essential idea: since f(0) = 0 and f(1)=1, if we define y = inf { x>= 0: f(x) > 0}, we have y in [0,1). Since f is infinitely differentiable, it can't hold that f(x) = 0 for all x <= y. So f must be decreasing somewhere before y, and thus its derivative is negative.
At the risk of once again showing myself to be a moron, that doesn't sound right. Suppose g(x) = 0 for x <= 0 and g(x) = exp(-1/x^2) for x > 0. Then g is infinitely differentiable but has the property you say f can't have on account of being infinitely differentiable.

(I am not claiming that g or anything like it is a counterexample to the theorem the question is asking for a proof of. It's easy to find places where, say, its second derivative is zero. But it's a counterexample to the I-think-not-a-theorem that says that an infinitely differentiable function can't be 0 for all x <= some x0 and strictly increasing thereafter.)

-- g
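For what it's worth, here is a small SymPy check of that parenthetical (my sketch, looking only at the x > 0 branch of Gareth's g):

    import sympy as sp

    x = sp.symbols('x', positive=True)   # restrict to x > 0, where g(x) = exp(-1/x^2)
    g = sp.exp(-1/x**2)
    g2 = sp.simplify(sp.diff(g, x, 2))   # second derivative of g

    print(g2)                            # proportional to (2 - 3*x**2) * exp(-1/x**2) / x**6
    print(sp.solve(sp.Eq(g2, 0), x))     # its only positive zero, x = sqrt(2/3) ≈ 0.816
    print(g2.subs(x, 1))                 # -2*exp(-1): strictly negative

So g'' vanishes at x = sqrt(2/3) and is negative for larger x, which is exactly why g itself is no counterexample to the Putnam statement even though it is flat to all orders at 0.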
Indeed, I tried to find a counterexample to the problem in the form of: f(x) := 0 (x <= 0); f(x) := exp(1 - 1/x) (otherwise); but its second derivative is annoyingly negative somewhere. Best wishes, Adam P. Goucher
Sent: Saturday, December 15, 2018 at 12:01 AM From: "Gareth McCaughan" <gareth.mccaughan@pobox.com> To: math-fun@mailman.xmission.com Subject: Re: [math-fun] Some questions from the 2018 Putnam Exam
On 14/12/2018 16:11, Victor Miller wrote:
Here's the essential idea: since f(0) = 0 and f(1)=1, if we define y = inf { x>= 0: f(x) > 0}, we have y in [0,1). Since f is infinitely differentiable, it can't hold that f(x) = 0 for all x <= y. So f must be decreasing somewhere before y, and thus its derivative is negative.
At the risk of once again showing myself to be a moron, that doesn't sound right. Suppose g(x) = 0 for x<=0 and g(x) = exp(-1/x^2) for x>0. Then g is infinitely differentiable but has the property you say f can't have on account of being infinitely differentiable.
(I am not claiming that g or anything like it is a counterexample to the theorem the question is asking for a proof of. It's easy to find places where, say, its second derivative is zero. But it's a counterexample to the I-think-not-a-theorem that says that an infinitely differentiable function can't be 0 for all x <= some x0 and strictly increasing thereafter.)
-- g
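(Checking Adam's remark by hand: for x > 0, f(x) = exp(1 − 1/x) gives f'(x) = exp(1 − 1/x)/x^2 and f''(x) = exp(1 − 1/x)·(1 − 2x)/x^4, so the second derivative is negative for every x > 1/2; for instance f''(1) = −1.)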
Hi, did you make progress on this? I ran into the same example you mentioned. On 12/14/18 16:01 , Gareth McCaughan wrote:
On 14/12/2018 16:11, Victor Miller wrote:
Here's the essential idea: since f(0) = 0 and f(1)=1, if we define y = inf { x>= 0: f(x) > 0}, we have y in [0,1). Since f is infinitely differentiable, it can't hold that f(x) = 0 for all x <= y. So f must be decreasing somewhere before y, and thus its derivative is negative.
At the risk of once again showing myself to be a moron, that doesn't sound right. Suppose g(x) = 0 for x<=0 and g(x) = exp(-1/x^2) for x>0. Then g is infinitely differentiable but has the property you say f can't have on account of being infinitely differentiable.
(I am not claiming that g or anything like it is a counterexample to the theorem the question is asking for a proof of. It's easy to find places where, say, its second derivative is zero. But it's a counterexample to the I-think-not-a-theorem that says that an infinitely differentiable function can't be 0 for all x <= some x0 and strictly increasing thereafter.)
1. Take f(0) = 0. Then look at the sign of the first non-zero derivative at x = 0. If it's positive, then f(x) < 0 for x < 0 sufficiently close to zero, absurd. If it's negative, then f(x) < 0 for x > 0 sufficiently close to zero, absurd again. So all derivatives must be zero at x = 0.

2. Look at f(x) with x < 0. If f(x) > 0, then by Lagrange's theorem f'(t) < 0 for some x < t < 0, done. Otherwise, f(x) = 0 for all x < 0.

3. Look at f(1) = 1, and assume all derivatives are non-negative. Remembering that f'(0) = 0, by Lagrange's theorem it must be that f'(t) > 1 for some t with 0 < t < 1. Since f is infinitely differentiable, this argument applies recursively. Consequently, we have:

a. t_n -> 0, the points given by recursively applying Lagrange's theorem;
b. (d^n f/dx^n)(t_n) > 1.

Therefore, lim n->+oo (d^n f/dx^n)(t_n) > 0, absurd because by now all derivatives must be zero at x = 0. Consequently, some derivative must be negative somewhere.

Andres.

On 12/31/18 18:45, Andres Valloud wrote:
Hi, did you make progress on this? I ran into the same example you mentioned.
On 12/14/18 16:01 , Gareth McCaughan wrote:
On 14/12/2018 16:11, Victor Miller wrote:
Here's the essential idea: since f(0) = 0 and f(1)=1, if we define y = inf { x>= 0: f(x) > 0}, we have y in [0,1). Since f is infinitely differentiable, it can't hold that f(x) = 0 for all x <= y. So f must be decreasing somewhere before y, and thus its derivative is negative.
At the risk of once again showing myself to be a moron, that doesn't sound right. Suppose g(x) = 0 for x<=0 and g(x) = exp(-1/x^2) for x>0. Then g is infinitely differentiable but has the property you say f can't have on account of being infinitely differentiable.
(I am not claiming that g or anything like it is a counterexample to the theorem the question is asking for a proof of. It's easy to find places where, say, its second derivative is zero. But it's a counterexample to the I-think-not-a-theorem that says that an infinitely differentiable function can't be 0 for all x <= some x0 and strictly increasing thereafter.)
On Wed, Jan 9, 2019 at 9:43 AM Andres Valloud <avalloud@smalltalk.comcastbiz.net> wrote:
Therefore, lim n->+oo (d^n f/dx^n)(t_n) > 0, absurd because by now all derivatives must be zero at x = 0.
If lim n->inf (d^k f/dx^k)(t_n) > 0 for some fixed k, that would show that the kth derivative of f at 0 was non-zero, a contradiction. But how do you get a contradiction from lim n->inf (d^n f/dx^n)(t_n) > 0?

All the derivatives start at zero when x = 0, and grow as x increases, and the larger n is, the faster the nth derivative grows. But why is this a contradiction? If there were an "infinitieth" derivative, that is, a limit of the nth derivatives as n goes to infinity, and if it were also continuous, then you would have shown that the infinitieth derivative at 0 is non-zero, a contradiction, since all the finite derivatives (and hence the infinitieth derivative) are 0 at 0. But I don't think the fact that a function is infinitely differentiable shows that it also has a continuous "infinitieth" derivative, so why is the thing you label as "absurd" impossible?

Andy
Consequently, some derivative must be negative somewhere.
Andres.
On 12/31/18 18:45 , Andres Valloud wrote:
Hi, did you make progress on this? I ran into the same example you mentioned.
On 12/14/18 16:01 , Gareth McCaughan wrote:
On 14/12/2018 16:11, Victor Miller wrote:
Here's the essential idea: since f(0) = 0 and f(1)=1, if we define y = inf { x>= 0: f(x) > 0}, we have y in [0,1). Since f is infinitely differentiable, it can't hold that f(x) = 0 for all x <= y. So f must be decreasing somewhere before y, and thus its derivative is negative.
At the risk of once again showing myself to be a moron, that doesn't sound right. Suppose g(x) = 0 for x<=0 and g(x) = exp(-1/x^2) for x>0. Then g is infinitely differentiable but has the property you say f can't have on account of being infinitely differentiable.
(I am not claiming that g or anything like it is a counterexample to the theorem the question is asking for a proof of. It's easy to find places where, say, its second derivative is zero. But it's a counterexample to the I-think-not-a-theorem that says that an infinitely differentiable function can't be 0 for all x <= some x0 and strictly increasing thereafter.)
-- Andy.Latto@pobox.com
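Not a proof of anything, but a rough numerical illustration of Andy's point (my sketch, using the standard flat-at-zero function g(x) = exp(-1/x) for x > 0, extended by 0 for x <= 0). Each individual derivative of g tends to 0 as x -> 0+, yet the maxima of the successive derivatives on (0, 1] keep growing and are attained at points drifting toward 0, so a statement like "the n-th derivative exceeds 1 at some t_n -> 0" carries no contradiction by itself.

    import numpy as np
    import sympy as sp

    x = sp.symbols('x', positive=True)
    g = sp.exp(-1/x)                     # the x > 0 branch of the flat example
    xs = np.linspace(1e-3, 1.0, 100000)

    for n in range(1, 7):
        dn = sp.lambdify(x, sp.diff(g, x, n), 'numpy')
        vals = dn(xs)
        i = int(np.argmax(vals))
        # n, the max of the n-th derivative on (0, 1], and where it is attained
        print(n, float(vals[i]), float(xs[i]))

The first few maxima come out around 0.54, 2.5, 30, ... at x ≈ 0.5, 0.21, 0.13, ..., so the values blow up while the maximizers slide toward 0.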
Oh I see, at each fixed t_n the n-th derivative can always run away with a bigger epsilon that requires a yet smaller delta. That's too bad :/. On 1/9/19 8:13, Andy Latto wrote:
On Wed, Jan 9, 2019 at 9:43 AM Andres Valloud <avalloud@smalltalk.comcastbiz.net> wrote:
Therefore, lim n->+oo (d^n f/dx^n)(t_n) > 0, absurd because by now all derivatives must be zero at x = 0.
If lim n->inf (d^k f/dx^k)(t_n) > 0 for some fixed k, that would show that the kth derivative of f at 0 was non-zero, a contradiction. But how do you get a contradiction from lim n->inf (d^n f/dx^n)(t_n) > 0?

All the derivatives start at zero when x = 0, and grow as x increases, and the larger n is, the faster the nth derivative grows. But why is this a contradiction? If there were an "infinitieth" derivative, that is, a limit of the nth derivatives as n goes to infinity, and if it were also continuous, then you would have shown that the infinitieth derivative at 0 is non-zero, a contradiction, since all the finite derivatives (and hence the infinitieth derivative) are 0 at 0. But I don't think the fact that a function is infinitely differentiable shows that it also has a continuous "infinitieth" derivative, so why is the thing you label as "absurd" impossible?
Andy
Consequently, some derivative must be negative somewhere.
Andres.
On 12/31/18 18:45 , Andres Valloud wrote:
Hi, did you make progress on this? I ran into the same example you mentioned.
On 12/14/18 16:01 , Gareth McCaughan wrote:
On 14/12/2018 16:11, Victor Miller wrote:
Here's the essential idea: since f(0) = 0 and f(1)=1, if we define y = inf { x>= 0: f(x) > 0}, we have y in [0,1). Since f is infinitely differentiable, it can't hold that f(x) = 0 for all x <= y. So f must be decreasing somewhere before y, and thus its derivative is negative.
At the risk of once again showing myself to be a moron, that doesn't sound right. Suppose g(x) = 0 for x<=0 and g(x) = exp(-1/x^2) for x>0. Then g is infinitely differentiable but has the property you say f can't have on account of being infinitely differentiable.
(I am not claiming that g or anything like it is a counterexample to the theorem the question is asking for a proof of. It's easy to find places where, say, its second derivative is zero. But it's a counterexample to the I-think-not-a-theorem that says that an infinitely differentiable function can't be 0 for all x <= some x0 and strictly increasing thereafter.)
I think you left out the punchline in A2. Once I've got this gigungous matrix, then what? On Tue, Dec 4, 2018 at 7:37 PM Dan Asimov <dasimov@earthlink.net> wrote:
These are the questions from the recent Putnam Exam that seemed most interesting to me. I've changed some typography, and a word or two, for clarity. (I haven't tried to solve any yet. If anyone posts a solution, maybe include a spoiler warning for that problem number?)
—Dan
************************************************************************* ----- A2 Let S_1, S_2, ..., S_(2^n−1) be the nonempty subsets of {1,2,...,n} in some order, and let M be the (2^n−1)×(2^n−1) matrix whose (i,j)th entry m(i,j) = 0 if S_i ∩ S_j is empty; m(i,j) = 1 otherwise. -----
----- A5 Let f : R → R be an infinitely differentiable function satisfying f(0) = 0, f(1) = 1, and f(x) ≥ 0 for all x ∈ R. Show that there exist a positive integer n and a real number x such that the nth derivative of f is negative when evaluated at x:
f^(n)(x) < 0. -----
----- B2 Let n be a positive integer, and let
f_n(z) = n +(n−1)z + (n−2)z^2 +··· + z^(n−1).
Prove that f_n has no roots in the closed unit disk {z ∈ C: |z| ≤ 1}. -----
----- B4 Given a real number a, we define a sequence by x_0 = 1, x_1 = x_2 = a, and
x_(n+1) = 2 x_n x_(n−1) − x_(n−2) for n ≥ 2.
Prove that if x_n = 0 for some n, then the sequence is periodic. -----
----- B5 Let f = (f_1, f_2) be a function from R^2 to R^2 with continuous partial derivatives ∂f_i/∂x_j that are positive everywhere. Suppose that
(∂f_1/∂x_1)(∂f_2/∂x_2) - (1/4)(∂f_1/∂x_2 + ∂f_2/∂x_1)^2 > 0
everywhere. Prove that f is one-to-one. ----- *************************************************************************
participants (8)
- Adam P. Goucher
- Allan Wechsler
- Andres Valloud
- Andy Latto
- Dan Asimov
- Gareth McCaughan
- Mike Stay
- Victor Miller