Answer:
The distance between interference fringes increases.
Explanation:
In a double-slit interference pattern, the distance of the n-th order bright fringe from the centre of the pattern is
[tex]y=\frac{n \lambda D}{d}[/tex]
where [tex]\lambda[/tex] is the wavelength of the light, D is the distance from the slits to the screen, and d is the separation between the slits.
If we take two adjacent bright fringes, of order n and (n+1), their separation is
[tex]\Delta y = \frac{(n+1)\lambda D}{d}-\frac{n\lambda D}{d}=\frac{\lambda D}{d}[/tex]
so the fringe spacing is inversely proportional to the slit separation d.
Therefore, if the separation between the slits decreases, the distance between the interference fringes increases.
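As a quick numerical check of the relation [tex]\Delta y = \frac{\lambda D}{d}[/tex], the short sketch below computes the fringe spacing for two slit separations. The specific values (600 nm light, a screen 1.5 m away) are illustrative assumptions, not from the question:

```python
# Fringe spacing Δy = λD/d for a double-slit setup.
# Values below are assumed for illustration only.
wavelength = 600e-9   # λ = 600 nm (orange-red light)
D = 1.5               # slit-to-screen distance in metres

for d in (0.50e-3, 0.25e-3):   # slit separations: 0.50 mm, then halved
    dy = wavelength * D / d
    print(f"d = {d*1e3:.2f} mm -> fringe spacing = {dy*1e3:.2f} mm")
# d = 0.50 mm -> fringe spacing = 1.80 mm
# d = 0.25 mm -> fringe spacing = 3.60 mm
```

Halving the slit separation d doubles the fringe spacing, consistent with the inverse proportionality derived above.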