Given that the height of the baseball is modeled by the equation:
h(t)=-1.85t^2+20t+1
where t is the time in seconds and h(t) is the height in meters.
The time taken for the ball to hit the ground is found as follows: when the ball hits the ground, h(t)=0, so the equation becomes:
-1.85t^2+20t+1=0
Solving this quadratic equation for t using the quadratic formula, with a=-1.85, b=20 and c=1:
t=[-b+/-sqrt(b^2-4ac)]/(2a)
Substituting these values into the formula we get:
t=[-20+/-sqrt(20^2-4(-1.85)(1))]/(2(-1.85))
t=[-20+/-sqrt(400+7.4)]/(-3.7)
t=[-20+/-sqrt(407.4)]/(-3.7)
t=-0.049771
or
t=10.8606
Since time cannot be negative, we select the positive root:
t=10.8606
Hence, the time taken for the ball to hit the ground is approximately:
t=10.8606 seconds
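
As a quick numerical check (not part of the original solution), the minimal Python sketch below re-computes both roots of -1.85t^2+20t+1=0 with the quadratic formula and confirms that the physically meaningful root is about 10.8606 s:

```python
import math

# Coefficients of h(t) = -1.85*t**2 + 20*t + 1, written as a*t**2 + b*t + c
a, b, c = -1.85, 20.0, 1.0

# Discriminant: b^2 - 4ac = 400 + 7.4 = 407.4
disc = b**2 - 4*a*c

# Both roots from the quadratic formula
t1 = (-b + math.sqrt(disc)) / (2*a)
t2 = (-b - math.sqrt(disc)) / (2*a)
print(t1, t2)                          # roughly -0.0498 and 10.8606

# Time cannot be negative, so keep the non-negative root
t_ground = max(t1, t2)
print(round(t_ground, 4))              # 10.8606

# Sanity check: the height at that time should be numerically zero
print(a*t_ground**2 + b*t_ground + c)  # ~0.0
```

Evaluating h(t) at the selected root returns a value extremely close to zero, which confirms that t = 10.8606 s is indeed when the ball reaches the ground.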