Using the mean concept, we find that on average it takes him 11.9 seconds to run a 100-yard dash.
The mean of a data set is the sum of all observations divided by the number of observations.
In this problem, the three observations are given by:
12.7 s, 11.2 s, 11.8 s.
Hence the mean is:
M = (12.7 + 11.2 + 11.8)/3 = 35.7/3 = 11.9 s.
So, on average, it takes him 11.9 seconds to run a 100-yard dash.
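For illustration only, here is a minimal Python sketch of the same calculation; the variable names are hypothetical and not part of the original solution:

```python
# Illustrative sketch of the mean calculation above.
times = [12.7, 11.2, 11.8]  # 100-yard dash times, in seconds

# Mean = sum of observations / number of observations
mean_time = sum(times) / len(times)

print(f"Mean time: {mean_time:.1f} s")  # prints: Mean time: 11.9 s
```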
More can be learned about the mean of a data set at https://brainly.com/question/24628525