If a quadratic-time algorithm requires 3 minutes to process 100 data values on a particular computer, approximately how long would it take for the algorithm to process 10,000 data values?


Answer: approximately 30,000 minutes (about 500 hours, or roughly 21 days)

Step-by-step explanation:

A quadratic-time algorithm's running time grows with the square of the input size: T(n) = c * n^2 for some constant c. So we must compare the squares of the input sizes, not the sizes themselves.

First, set up the proportion using n^2 as the amount of work:

3 min ------> 100^2 = 10,000 units of work

x min ------> 10,000^2 = 100,000,000 units of work

Cross multiply: 10,000 * x = 3 * 100,000,000

Make x the subject:

x = 300,000,000 / 10,000

x = 30,000 min.

As a sanity check: the input grew by a factor of 10,000/100 = 100, so for a quadratic algorithm the time grows by a factor of 100^2 = 10,000, and 3 min * 10,000 = 30,000 min.

Hence the algorithm needs about 30,000 minutes (500 hours, or roughly 21 days) to process 10,000 data values.
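To double-check the arithmetic, here is a minimal Python sketch of the same scaling estimate. It assumes the running time follows T(n) = c * n^2 exactly (real measurements only approximate this), and the function name estimate_time is illustrative, not part of the original problem:

```python
# Minimal sketch, assuming running time is modeled as T(n) = c * n^2.
# estimate_time is a hypothetical helper for this problem only.

def estimate_time(known_n, known_minutes, target_n):
    """Scale a measured quadratic running time to a new input size."""
    c = known_minutes / known_n ** 2   # solve T(known_n) = c * known_n^2 for c
    return c * target_n ** 2           # predicted time at target_n

minutes = estimate_time(known_n=100, known_minutes=3, target_n=10_000)
print(f"{minutes:,.0f} minutes")       # 30,000 minutes
print(f"{minutes / 60:,.1f} hours")    # 500.0 hours
```

Running this prints 30,000 minutes, matching the hand calculation above.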