In Europe the standard voltage in homes is 220 V instead of the 120 V used in the United States. Therefore, a "100-W" European bulb is intended for use with a 220-V potential difference. (a) If you bring a "100-W" European bulb home to the United States, what should be its US power rating? (b) How much current will the 100-W European bulb draw in normal use in the United States? [Answers: 29.8 W, 0.248 A]

Answer:

Explanation:

In Europe the standard voltage in homes is 220 V instead of the 120 V used in the United States.

A "100-W" European bulb would be intended for use with a 220-V potential difference.

(a) In Europe, V = 220 V and P = 100 W.

First, find the resistance of the bulb:

[tex]P=\dfrac{V^2}{R}\\\\R=\dfrac{V^2}{P}\\\\R=\dfrac{(220)^2}{100}\\\\R=484\ \Omega[/tex]

If you bring the "100-W" European bulb home to the United States, its power rating at the US voltage of 120 V is:

[tex]P=\dfrac{V^2}{R}\\\\P=\dfrac{(120)^2}{484}\\\\P=29.75\ W[/tex]

or

P = 29.8 W

(b) Let I be the current the 100-W European bulb draws in normal use in the United States. Then,

[tex]I=\dfrac{V}{R}\\\\I=\dfrac{120}{484}\\\\I=0.248\ A[/tex]

Hence, this is the required solution.
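For anyone who wants to check the numbers, here is a minimal Python sketch of the same calculation (it assumes an idealized bulb whose resistance is the same at both voltages; a real filament's resistance changes with temperature):

```python
# Idealized check: treat the bulb as a fixed resistor.
V_EU = 220.0   # European line voltage (V)
V_US = 120.0   # US line voltage (V)
P_EU = 100.0   # rated power in Europe (W)

R = V_EU**2 / P_EU    # resistance from P = V^2 / R
P_US = V_US**2 / R    # power drawn at 120 V
I_US = V_US / R       # current drawn at 120 V

print(R)               # 484.0 ohm
print(round(P_US, 1))  # 29.8 W
print(round(I_US, 3))  # 0.248 A
```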

Answer:

Explanation:

Voltage in Europe, V = 220 V

Power, P = 100 W

Let R be the resistance of the bulb.

[tex]R = \frac{V^{2}}{P}[/tex]

[tex]R = \frac{220^{2}}{100}[/tex]

R = 484 ohm

(a)

Voltage in United States , V = 120 V

R = 484 ohm

Let the power be P'.

[tex]P' = \frac{V^{2}}{R}[/tex]

[tex]P' = \frac{120^{2}}{484}[/tex]

P' = 29.8 W

(b)

Let the current be i.

P' = V x i

29.8 = 120 x i

i = 0.248 A
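
As a quick consistency check, the current found from P' = V x i agrees with Ohm's law i = V / R (a short Python sketch using the values computed above):

```python
P_prime = 29.75   # power at 120 V from part (a) (W)
V_US = 120.0      # US line voltage (V)
R = 484.0         # bulb resistance (ohm)

print(round(P_prime / V_US, 3))  # 0.248 A, via P' = V * i
print(round(V_US / R, 3))        # 0.248 A, via i = V / R
```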