# Uncertainty principle

#### jojimt

I have a basic question on the uncertainty principle. The premise for the uncertainty is that "one will not be able to determine the position of the particle more accurately than the distance between the wave crests of light". I am trying to understand what's the basis of this primary limitation. What limits the precision to wavelength?

Thanks,
Joji

#### Woody

I think you might be confusing two different effects.
There is an "optical" effect where the resolution of an optical instrument (microscope, telescope, ...) is limited by the diffraction experienced by the photons, which is in turn determined by the wavelength.

The uncertainty principle is a completely different can of worms.
At subatomic scales, position and momentum are fundamentally linked:
it is impossible to define both simultaneously with total accuracy.
The more accurately you pin one down, the vaguer the other becomes.

This is often further confused with the observer effect:
it is impossible to observe something without disturbing it.
For everyday objects the disturbance is (very) small compared to the object;
at atomic scales, the disturbance becomes substantial compared to the size of the object.

For example, to examine the state of an atom you might bounce a photon off it;
however, doing so changes that state.
It would be like trying to track the path of a basketball by firing tennis balls at it.
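Woody's position–momentum trade-off can be put in numbers. A minimal sketch (my own illustration, not from the thread), using the Heisenberg relation Δx·Δp ≥ ħ/2:

```python
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_momentum_uncertainty(delta_x):
    """Smallest possible momentum spread (kg*m/s) for a given
    position spread delta_x (m), from delta_x * delta_p >= hbar/2."""
    return HBAR / (2 * delta_x)

# Pinning an electron down to 1 angstrom (1e-10 m, roughly atomic size)
# forces a momentum spread of about 5e-25 kg*m/s, which for an electron
# (mass ~9.11e-31 kg) is a speed spread of several hundred km/s.
dp = min_momentum_uncertainty(1e-10)
```

The point of the sketch: the tighter you make `delta_x`, the larger the unavoidable `delta_p`, independent of how the measurement is performed.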


#### jojimt

No, I am just trying to understand the basis of:

> There is an "optical" effect where the resolution of an optical instrument (microscope, telescope, ...) is limited by the diffraction experienced by the photons, which is in turn determined by the wavelength.

How is this limited by the wavelength? That part is not intuitively obvious to me. [The uncertainty principle itself makes sense given this limitation.]

I guess another way to ask my question is: "How exactly is the position of a particle measured?" I am guessing the procedure is to emit a wave and measure the time it takes for the reflected wave to be detected by an antenna. Perhaps the emission and reception of the wave cannot be timed more precisely than one wavelength?


#### HallsofIvy

Imagine you are standing in a dark room with many objects, both large and small, around you. The only "tool" you have to determine what the objects are is a flat board that is 6 inches wide. Do you see that you will not be able to make out any features smaller than 6 inches?

If you shine light with wavelength w on something, you will not be able to see features smaller than w. The problem with reducing the wavelength is that shorter-wavelength photons carry higher energy. The smaller your board is, the harder you hit with it! So small objects may be knocked a good distance from where you detected them.
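The "smaller board hits harder" point is just the photon energy relation E = hc/λ. A quick sketch of my own (not from the thread) showing that halving the wavelength doubles the energy of each photon "hit":

```python
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def photon_energy(wavelength):
    """Energy in joules of a photon with the given wavelength in metres."""
    return H * C / wavelength

# Green light (500 nm) vs. UV at half the wavelength (250 nm):
# the shorter-wavelength photon carries twice the energy,
# so it disturbs the object it probes twice as hard.
ratio = photon_energy(250e-9) / photon_energy(500e-9)
```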


#### jojimt

Sorry. I don't quite follow this analogy although I do understand the quantization problem if that's what you're referring to. I also understand the shorter wavelength = higher energy part and the resulting inaccuracy in momentum (not position).

The part I don't follow is how the wavelength affects the measurement of position.

The more I think about this, it seems to me that what you detect is the presence of a pulse, not its phase. Perhaps this is what limits the accuracy to the length of the pulse.

#### benit13

I think the title of your post has puzzled people. As Woody points out, the uncertainty principle is something completely different.

> No, I am just trying to understand the basis of "There is an "optical" effect where the resolution of an optical instrument (microscope, telescope, ...) is limited by the diffraction experienced by the photons, which is in turn determined by the wavelength." How is this limited by the wavelength? That part is not intuitively obvious to me. [The uncertainty principle itself makes sense given this limitation].
In a diffraction grating,

$$\displaystyle d \sin \theta = n \lambda$$

where $$\displaystyle d$$ is the spacing, $$\displaystyle \theta$$ is the observed scattering angle, $$\displaystyle \lambda$$ is the wavelength of the light you are using and $$\displaystyle n$$ is an integer.

If $$\displaystyle \lambda > d$$, the equation has no solutions and no scattering angles are observed. If $$\displaystyle \lambda < d$$, you're guaranteed at least one observable scattering angle. This is why the ability to detect certain structures depends on the wavelength of your irradiating light: if the wavelength is too large, you get no scattering angles at all. It is also why blue lasers (compared to red lasers) can read media at a finer resolution, and why Blu-rays hold more information than standard DVDs.

Most substances can be treated like a diffraction grating if you're doing (x-ray) Bragg diffraction, but things get a bit more spicy once you account for the crystal structure.
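benit13's condition can be checked numerically. A small sketch of my own (the function name is mine), enumerating the diffraction orders n for which sin θ = nλ/d ≤ 1:

```python
import math

def scattering_angles(d, wavelength):
    """All observable diffraction angles (degrees) satisfying
    d * sin(theta) = n * wavelength for integer n >= 1."""
    angles = []
    n = 1
    while n * wavelength <= d:  # sin(theta) = n*lambda/d must be <= 1
        angles.append(math.degrees(math.asin(n * wavelength / d)))
        n += 1
    return angles

# Red light (650 nm) on a 1000 nm grating: lambda < d, so one order exists.
print(scattering_angles(d=1000e-9, wavelength=650e-9))
# The same light on a 500 nm grating: lambda > d, no orders at all,
# so the structure is invisible at this wavelength.
print(scattering_angles(d=500e-9, wavelength=650e-9))  # []
```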


#### jojimt

Apologies for the title. The line I quoted in my original post is actually from *A Brief History of Time*, in the context of the uncertainty principle. I guess I was lost in that context.

Thanks for that mathematical explanation. I can see how it establishes λ as the limit on resolution. Thanks!