SNR in X-ray imaging

The problem statement, all variables and given/known data
(1) An ideal digital detector suffers only from quantum noise. If, after an exposure of 5 µGy, the mean pixel value in the image is 100 and the standard deviation of the pixel values is 5, calculate the SNR.
The relationship between pixel value and detector dose is linear.
(2) What is the effect on SNR of applying a linear gain of factor 4 to increase all pixel values?

Attempt at a solution
As I understand it, SNR = mean/standard deviation = 100/5 = 20
(though I am not certain; it could be 100/sqrt(10)). Clarification would be helpful.
I also think gain has no effect on SNR, since it multiplies signal and noise by the same factor, but I am not certain.
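A quick numerical sketch may help check both guesses. This uses the numbers from the problem (mean 100, standard deviation 5) with a Gaussian noise model, which is a reasonable stand-in for quantum (Poisson) noise at these pixel values; the array size and noise model are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated image: mean pixel value 100, standard deviation 5
# (Gaussian used here as an approximation to quantum noise).
pixels = rng.normal(loc=100, scale=5, size=1_000_000)

snr = pixels.mean() / pixels.std()
print(f"SNR = {snr:.1f}")  # close to 100/5 = 20

# Linear gain of 4 scales the mean and the standard deviation
# by the same factor, so their ratio (the SNR) is unchanged.
gained = 4.0 * pixels
print(f"SNR after gain = {gained.mean() / gained.std():.1f}")
```

The second print confirms the intuition about gain: multiplying every pixel by a constant rescales signal and noise together, leaving SNR untouched.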
Thanks for your help
