An air traffic controller spots two airplanes at the same altitude converging to a point as they fly at right angles to each other. One airplane is 150 miles from the point and has a speed of 600 miles per hour. The other is 200 miles from the point and has a speed of 800 miles per hour.

(a) At what rate is the distance between the planes changing?

(b) How much time does the controller have to get one of the airplanes on a different flight path?

Answer:

irspow
They are traveling at right angles to each other, so we can say one is traveling north to south and the other west to east. Then their positions, y and x, measured as distances from the convergence point, are:

y=150-600t  x=200-800t  (miles, with t in hours)

By using the Pythagorean Theorem we can find the distance between these two planes as a function of time:

d^2=y^2+x^2, using y and x from above

d^2=(150-600t)^2+(200-800t)^2

d^2=22500-180000t+360000t^2+40000-320000t+640000t^2

d^2=1000000t^2-500000t+62500

d=√(1000000t^2-500000t+62500)
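As a quick sanity check of the algebra (a hypothetical sympy sketch, not part of the original solution; any CAS would do the same), you can expand y^2+x^2 symbolically:

```python
# Hypothetical check of the expansion above using sympy.
import sympy as sp

t = sp.symbols('t')
y = 150 - 600*t            # north-south plane's distance from the point
x = 200 - 800*t            # west-east plane's distance from the point

print(sp.expand(y**2 + x**2))   # 1000000*t**2 - 500000*t + 62500
```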

So the rate of change is the derivative of d:

dd/dt=(1/2)(2000000t-500000)/√(1000000t^2-500000t+62500)

dd/dt=(1000000t-250000)/√(1000000t^2-500000t+62500)

So the rate depends upon t and is not constant. The problem asks for the rate at the instant described, which is t=0, so:

dd/dt = -250000/√62500 = -250000/250 = -1000 mph

The distance between the planes is decreasing at 1000 miles per hour.
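A minimal numeric sketch of that evaluation (assuming the formula above; the function name d_rate is just for illustration):

```python
# Evaluate dd/dt at t = 0, the instant described in the problem.
import math

def d_rate(t):
    # derivative of the distance function derived above
    return (1000000*t - 250000) / math.sqrt(1000000*t**2 - 500000*t + 62500)

print(d_rate(0))   # -1000.0, i.e. the planes are closing at 1000 mph
```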

...

To find how much time the controller has to change one of the airplanes' flight paths, we only need to solve for when d=0, or equivalently d^2=0...

1000000t^2-500000t+62500=0

62500(16t^2-8t+1)=0

62500(16t^2-4t-4t+1)=0

62500(4t(4t-1)-1(4t-1))=0

62500(4t-1)(4t-1)=0

62500(4t-1)^2=0

4t-1=0

4t=1

t=1/4 hr
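To double-check, a short sympy sketch (hypothetical; any quadratic solver works) finds the same root:

```python
# Solve d^2 = 0 directly; confirms the factoring above.
import sympy as sp

t = sp.symbols('t')
print(sp.solve(sp.Eq(1000000*t**2 - 500000*t + 62500, 0), t))   # [1/4]
```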

Well, technically the controller has less than 1/4 hour (15 minutes), because at t=1/4 hr impact will occur :)