Problem
A kite flies at a constant height of 60 meters above the ground, moving horizontally at $2\,\frac ms$. How fast is the line being released at the moment when $100$ meters of line are out?
My attempt
I visualized this as a right triangle, where the hypotenuse $L$ is the line, the horizontal edge is $x$ with $x' = 2\frac ms$ and the vertical edge is $60\mathrm m$.
I tried creating a function, $$x(L) = \sqrt{L^2 - 60^2}$$
which gives $$x'(L) = \frac{L}{\sqrt{L^2 - 60^2}}$$
From here I'm unsure what to do. What I want to know is $L'$ when $L = 100$, right?
This is where I'm stuck. Any help appreciated!
Best Answer
Note that the length of the horizontal edge as a function of time is $x(t)=2t$. Hence by the Pythagorean theorem: $$L(t)=\sqrt{60^{2}+x(t)^{2}}=\sqrt{60^{2}+4t^{2}}.$$ Let's see when $L(t)=100$: $$100=\sqrt{60^{2}+4t^{2}} \Leftrightarrow t=40.$$
And at $t=40$ the line is released with speed $$L'(40)=\frac{8\cdot 40}{2\sqrt{60^{2}+4\cdot40^{2}}}=1.6\ \frac{m}{s}.$$
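The steps above can be checked numerically. A minimal sketch (the function name `L` and the finite-difference step are my own choices, not from the post): define $L(t)=\sqrt{60^2+4t^2}$, confirm that $L(40)=100$, and approximate $L'(40)$ with a central difference.

```python
import math

def L(t):
    # line length at time t, with x(t) = 2t and height 60 m
    return math.sqrt(60**2 + (2*t)**2)

t = 40
# check that 100 m of line are out at t = 40
assert abs(L(t) - 100) < 1e-9

# central-difference approximation of L'(40)
h = 1e-6
L_prime = (L(t + h) - L(t - h)) / (2 * h)
print(round(L_prime, 6))  # close to 1.6, matching the exact answer
```

This agrees with the closed form $L'(t)=\frac{4t}{\sqrt{60^2+4t^2}}$, which gives $\frac{160}{100}=1.6\ \frac{m}{s}$ at $t=40$.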