Originally Posted by **EME** I came across this question: A pupil suggests that a more accurate value for the wavelength of the laser light can be found if a grating with a slit separation of 2x10^-6m is used (it was formerly 5x10^-6m). Explain why this suggestion is correct.
I kind of understand, but don't know how to write it down in words.

If you look up the "grating equation" you should find

*s* [sin(**i**) + sin(**d**)] = **n** lambda

where **s** is the spacing of lines in the grating, **i** is the angle of incidence, **d** is the angle of the diffracted light, and (**n** lambda) is an integer (the order number) times the wavelength of the light. If the grating spacing is made smaller, then the diffraction angle will be greater and can be measured with greater relative precision. [I would say "precision" is a more relevant term than "accuracy" for this measurement.]

Simply stated, the smaller the spacing, the greater the diffraction angle.
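
To see the size of the effect, here is a minimal numeric sketch. The laser wavelength of 650 nm is an assumption (the question doesn't give one); I also assume normal incidence, so sin(**i**) = 0, and the first-order maximum, **n** = 1:

```python
import math

# Assumed values: a 650 nm red laser, normal incidence (sin(i) = 0),
# first-order maximum (n = 1). Then the grating equation reduces to
# s * sin(d) = n * lambda, so d = asin(n * lambda / s).
wavelength = 650e-9  # metres (assumed)
n = 1

for s in (5e-6, 2e-6):  # the two grating spacings from the question
    theta = math.degrees(math.asin(n * wavelength / s))
    print(f"s = {s:.0e} m  ->  first-order angle = {theta:.1f} deg")
```

With these assumed numbers the first-order beam moves from roughly 7.5° out to roughly 19°. The same uncertainty in reading the angle (say ±0.5°) is then a much smaller *fraction* of the measured value, which is why the smaller spacing gives a more precise wavelength.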