I think the title of your post has puzzled people. As Woody points out, the uncertainty principle is something completely different.

Originally Posted by **jojimt**
No, I am just trying to understand the basis of "There is an 'optical' effect where the resolution of an optical instrument (microscope, telescope, ...) is limited by the diffraction experienced by the photons, which is in turn determined by the wavelength." How is this limited by the wavelength? That part is not intuitively obvious to me. [The uncertainty principle itself makes sense given this limitation.]

In a diffraction grating,

$\displaystyle d \sin \theta = n \lambda$

where $\displaystyle d$ is the spacing, $\displaystyle \theta$ is the observed scattering angle, $\displaystyle \lambda$ is the wavelength of the light you are using and $\displaystyle n$ is an integer.

If $\displaystyle \lambda > d$, the equation has no solutions for $\displaystyle n \geq 1$ and no scattering angles are observed. If $\displaystyle \lambda < d$, you're guaranteed at least one observable scattering angle. This is why the ability to detect a structure depends on the wavelength of your incident light: if the wavelength is larger than the structure's spacing, you get no scattering angles at all. It is also why blue lasers (compared to red lasers) can read media at a finer resolution, and why Blu-rays can hold more information than standard DVDs.
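To make this concrete, here is a minimal sketch (the grating spacing and laser wavelengths below are illustrative, not taken from any actual disc spec) that solves $d \sin \theta = n \lambda$ for all allowed orders:

```python
import math

def diffraction_orders(d, wavelength):
    """Return the scattering angles (degrees) allowed by d*sin(theta) = n*lambda.

    Orders exist only while n*lambda/d <= 1, so a wavelength larger than
    the spacing d yields an empty list.
    """
    angles = []
    n = 1
    while n * wavelength / d <= 1.0:
        angles.append(math.degrees(math.asin(n * wavelength / d)))
        n += 1
    return angles

# Hypothetical grating with 1000 nm spacing:
d = 1000.0
print(diffraction_orders(d, 650.0))   # red laser: one allowed order
print(diffraction_orders(d, 405.0))   # blue laser: two allowed orders
print(diffraction_orders(d, 1200.0))  # lambda > d: no orders at all
```

Note how the shorter (blue) wavelength produces more diffraction orders for the same spacing, while a wavelength exceeding the spacing produces none, which is exactly the resolution limit described above.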

Most crystalline substances can be treated like a diffraction grating if you're doing (x-ray) Bragg diffraction, but it can get a bit more spicy once you account for the full crystal structure.
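For the Bragg case, the same wavelength limit appears through $n \lambda = 2 d \sin \theta$, where $d$ is now the spacing between crystal planes. A quick sketch (the plane spacing and x-ray wavelength below are round illustrative numbers, not measured values for any specific crystal):

```python
import math

def bragg_angle(d, wavelength, n=1):
    """Solve n*lambda = 2*d*sin(theta) for theta (degrees).

    Returns None when n*lambda/(2*d) > 1, i.e. when the wavelength is
    too long for that order to diffract off planes with spacing d.
    """
    s = n * wavelength / (2.0 * d)
    if s > 1.0:
        return None
    return math.degrees(math.asin(s))

# Illustrative numbers: ~0.154 nm x-rays on planes spaced 0.2 nm apart.
print(bragg_angle(0.2, 0.154))  # a first-order angle exists
print(bragg_angle(0.2, 0.5))    # wavelength too long: None
```

The same principle as the optical grating: x-ray wavelengths are comparable to atomic plane spacings (fractions of a nanometre), which is why x-rays, and not visible light, can resolve crystal structure.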