X-rays are reflected from a simple cubic crystal by Bragg reflection. ... degrees, what is the smallest wavelength that can be reliably detected using this crystal?
We can use Bragg's Law to relate the angle of reflection to the spacing between crystal planes and the wavelength of the incident X-rays:
nλ = 2d sinθ
where n is a positive integer (the order of the reflection), λ is the wavelength of the incident X-rays, d is the spacing between crystal planes, and θ is the glancing angle between the incident beam and the crystal planes.
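As a quick numerical sketch of Bragg's Law (the plane spacing d ≈ 9.6 Å below is an assumed illustrative value, chosen so that a 6° glancing angle corresponds to a wavelength near 2.0 Å):

```python
import math

def bragg_wavelength(d_angstrom, theta_deg, n=1):
    """Wavelength selected by Bragg reflection: n*lam = 2*d*sin(theta)."""
    return 2.0 * d_angstrom * math.sin(math.radians(theta_deg)) / n

# First-order reflection at a 6-degree glancing angle from planes
# spaced 9.6 angstroms apart (assumed values for illustration):
print(bragg_wavelength(9.6, 6.0))  # ~2.01 angstroms
```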
For a simple cubic crystal, the spacing between crystal planes is given by:
d = a / √(h^2 + k^2 + l^2)
where a is the length of the cubic unit cell and h, k, and l are the Miller indices of the crystal plane.
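A small sketch of this spacing formula (the lattice constant a = 4.0 Å and the Miller indices here are arbitrary illustrative values, not taken from the problem):

```python
import math

def cubic_d_spacing(a, h, k, l):
    """Interplanar spacing d = a / sqrt(h^2 + k^2 + l^2) for a cubic lattice."""
    return a / math.sqrt(h**2 + k**2 + l**2)

# (100) planes are spaced a apart; higher-index planes are closer together.
print(cubic_d_spacing(4.0, 1, 0, 0))  # 4.0
print(cubic_d_spacing(4.0, 1, 1, 0))  # ~2.83
```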
Since the crystal's density is measured with an r.m.s. error of 3 parts in 10^4, we can estimate the resulting error in the length of the unit cell. The mass per unit cell is fixed, so ρ ∝ 1/a^3, and the fractional errors are related by:
Δa / a = 1/3 (Δρ / ρ)
where Δρ / ρ is the fractional error in the density. Plugging in the given value, we get:
Δa / a = 1/3 (3/10^4) = 1/10^4
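The factor of 1/3 can be checked numerically: perturb the density by its fractional error and see how far a ∝ ρ^(-1/3) moves (units cancel, so a = 1 is used):

```python
rho_err = 3e-4                 # given r.m.s. density error: 3 parts in 10^4
a = 1.0                        # arbitrary lattice constant (units cancel)
a_shifted = a * (1.0 + rho_err) ** (-1.0 / 3.0)   # a ∝ rho**(-1/3)
print(abs(a_shifted - a) / a)  # ~1.0e-4, i.e. one third of rho_err
```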
Since the Miller-index factor √(h^2 + k^2 + l^2) is exact, the plane spacing inherits the same fractional error as the unit cell:
Δd / d = Δa / a = 1/10^4
Now we can propagate these errors to the wavelength. For a first-order reflection (n = 1), Bragg's Law gives:
λ = 2d sinθ = 2a sinθ / √(h^2 + k^2 + l^2)
Taking the logarithm and differentiating, and adding the two error terms as a worst case:
Δλ / λ = (Δd / d) + cotθ · Δθ
where Δθ is the error in the glancing angle, expressed in radians. With θ = 6° (so cotθ ≈ 9.51) and Δθ = 0.017 rad, we get:
Δλ / λ = (1/10^4) + (0.017)(9.51) ≈ 0.162
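A short check of this error estimate, using the angle and angular error assumed above (θ = 6°, Δθ = 0.017 rad, i.e. about 1°):

```python
import math

theta = math.radians(6.0)    # glancing angle
d_theta = 0.017              # angular error in radians (~1 degree)
dd_over_d = 1e-4             # fractional error in d, from the density

# Worst-case linear sum: d_lam/lam = dd/d + cot(theta) * d_theta
rel_err = dd_over_d + d_theta / math.tan(theta)
print(rel_err)  # ~0.162
```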
So the smallest wavelength that can be reliably detected is:
λ_min = λ (1 + Δλ / λ) = (2.0 Å)(1 + 0.162) ≈ 2.32 Å
Therefore, the smallest wavelength that can be reliably detected using this crystal is about 2.32 Å.
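And a final one-line check of the closing arithmetic, with λ = 2.0 Å as used above:

```python
lam = 2.0                     # wavelength in angstroms, from the text
rel_err = 0.162               # fractional uncertainty from the previous step
print(lam * (1.0 + rel_err))  # ~2.32 angstroms
```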