When an apple is dropped from a tower 256 feet high, the function h(t) = -167 + 256 models the height of the apple, in feet, after t seconds. Determine, algebraically, the number of seconds it takes the apple to hit the ground.

Answer:

around 1.5 sec

Step-by-step explanation:

Basically, you want to figure out at what time the height is 0.

Since h(t) represents the height, set it equal to 0 and solve for t.

I believe you might have forgotten the t in the equation, so I assumed it was -167t:

0 = -167t + 256

-256 = -167t

t = 256/167

t ≈ 1.53293413174

So the apple hits the ground around 1.5 seconds after it was dropped.

Alternatively, you could plug the equation into Desmos, replacing h(t) with y and t with x, and find the x-intercept.
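If you'd rather check the algebra numerically than graph it, here is a minimal Python sketch; it assumes the same linear model, -167t + 256, guessed above (the model itself is my assumption, carried over from the answer):

# Height model assumed above: h(t) = -167t + 256 (with the missing t restored)
def h(t):
    return -167 * t + 256

# Solving 0 = -167t + 256 by hand gives t = 256/167
t_ground = 256 / 167
print(round(t_ground, 4))  # 1.5329 seconds
print(h(t_ground))         # ~0.0, confirming t_ground is the root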

The apple dropping to the ground is an illustration of free fall.

It takes the apple about 1.53 seconds to hit the ground.

Given the model (with the missing t restored):

[tex]h(t) = -167t + 256[/tex]

When the apple hits the ground, we have:

[tex]h(t) = 0[/tex]

Substitute [tex]h(t) = 0[/tex] in [tex]h(t) = -167t + 256[/tex]

[tex]0 = -167t + 256[/tex]

Add 167t to both sides

[tex]167t = 256[/tex]

Solve for t

[tex]t = \frac{256}{167}[/tex]

[tex]t \approx 1.53293413174[/tex]

Approximate to two decimal places

[tex]t \approx 1.53[/tex]

Hence, the apple hits the ground after approximately 1.53 seconds.
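As a quick symbolic check (a sketch only, assuming Python with SymPy is available; this is not part of the original worked solution), the same equation can be solved exactly:

import sympy as sp

t = sp.symbols('t')
# Same assumed model as above: h(t) = -167t + 256
sol = sp.solve(sp.Eq(-167 * t + 256, 0), t)
print(sol)            # [256/167]
print(float(sol[0]))  # ≈ 1.5329

SymPy returns the exact rational 256/167, which rounds to the 1.53 seconds found above.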

Read more about free fall at:

https://brainly.com/question/10579318