|10-30-2012, 06:46 AM||#1|
Why does the power of a sound wave remain constant with distance, but not its amplitude?
Suppose a sound wave is emitted uniformly in all directions by a speaker.
We know that intensity of sound = power/area.
1) As distance increases, the total power carried by the wave has to remain constant, but the area it spreads over grows. Intensity thus decreases, right?
2) But if the power of the sound wave remains constant, doesn't that also mean the amplitude of the sound wave remains constant, since power is proportional to amplitude squared?
However, I came across this question:
At a distance of 1.1 m from the speaker, the amplitude of the sound wave is 1.2 x 10^-8 m. What is the amplitude at a distance of 1.7 m from the speaker?
Why does the amplitude change with distance, given my reasoning above? Which part of my explanation is flawed? I have thought about this for many hours and it's really troubling me! Please help!
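The flaw is in step 2: power proportional to amplitude squared holds for the power passing through a unit of area (the intensity), not for the total power. The total power stays fixed while the sphere it passes through grows as 4*pi*r^2, so intensity falls as 1/r^2 and, because intensity goes as amplitude squared, amplitude falls as 1/r. A minimal sketch of the worked example under that scaling, assuming an ideal point source with no absorption (variable names are mine):

```python
# Spherical wave from a point source, no absorption: power P is conserved,
# intensity I = P / (4*pi*r^2) falls as 1/r^2, and since I ~ A^2,
# the displacement amplitude A falls as 1/r.

r1, A1 = 1.1, 1.2e-8  # given: distance (m) and amplitude (m)
r2 = 1.7              # new distance (m)

A2 = A1 * r1 / r2     # A2/A1 = r1/r2
print(f"A at {r2} m: {A2:.2e} m")  # ~7.76e-09 m

# Consistency check: the implied intensity ratio matches the inverse-square law.
print(f"(A2/A1)^2 = {(A2/A1)**2:.3f}, (r1/r2)^2 = {(r1/r2)**2:.3f}")
```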
|10-30-2012, 07:36 AM||#2|
|10-31-2012, 08:57 PM||#3|
Thanks Chip. The fall in intensity is due to the increase in surface area: for a point source radiating through a spherical surface, it follows the inverse-square law, I = P/(4*pi*r^2). However, if we consider a parallel beam, the cross-sectional area stays constant, so ideally the intensity does not fall at all; lasers are a good example. Hence, at large distances, where neighbouring rays can be treated as nearly parallel, the intensity varies little over small changes in distance. I know this is not directly related to the question, but I just thought I would put in a word.
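A quick numerical contrast of the two geometries (the power and beam cross-section below are made-up illustrative values, not from the thread):

```python
import math

P = 1.0           # source power in W (illustrative value)
beam_area = 1e-4  # cross-section of an ideal collimated beam in m^2 (illustrative)

for r in (1.0, 10.0, 100.0):
    I_point = P / (4 * math.pi * r**2)  # point source: area grows as 4*pi*r^2
    I_beam = P / beam_area              # ideal parallel beam: area is constant
    print(f"r = {r:5.1f} m  point-source I = {I_point:.2e} W/m^2  beam I = {I_beam:.2e} W/m^2")
```

The point-source intensity drops four orders of magnitude over this range, while the ideal beam's intensity never changes.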