Suppose it is desired to estimate the average time a customer spends in Dollar Tree to within 5 minutes with 99% confidence. The standard deviation of the times is estimated to be 15 minutes. How large a sample should be taken to obtain an interval of the desired width?
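The standard sample-size formula for estimating a mean applies here: n ≥ (z_{α/2} · σ / E)², rounded up. The short sketch below plugs in E = 5, σ = 15, and the 99% critical value (about 2.576); the variable names are illustrative and the code simply automates this arithmetic.

```python
from math import ceil
from statistics import NormalDist

# Values taken from the problem statement above.
E = 5            # desired margin of error (half-width), in minutes
confidence = 0.99
sigma = 15       # estimated standard deviation of the times, in minutes

# Critical value z_{alpha/2} for 99% confidence (about 2.576).
z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)

# Sample-size formula for a mean: n >= (z * sigma / E)^2, rounded up
# to the next whole customer.
n = ceil((z * sigma / E) ** 2)
print(f"z = {z:.4f}, required sample size n = {n}")
```

With these numbers, (2.576 × 15 / 5)² ≈ 59.7, so a sample of 60 customers is needed.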