Page updated: April 12, 2020
Author: Curtis Mobley

Beam and Point Spread Functions

This page defines beam and point spread functions and introduces their use in image analysis.

The Beam Spread Function

Consider a well collimated source emitting spectral power P in direction 𝜃 = 0, as illustrated in Fig. 1. As the beam passes through the medium, scattering will spread out the beam as illustrated by the green arrows in the figure. Absorption will reduce the beam power. The combined effects of scattering and absorption give some spectral irradiance E(r,𝜃) on the surface of a sphere of radius r at an angle of 𝜃 relative to the direction of the emitted beam. The beam spread function (BSF) is then defined as the detected irradiance normalized by the emitted power:

\[
\mathrm{BSF}(r,\theta) \equiv \frac{E(r,\theta)}{P} \qquad \left[\frac{\mathrm{W\,m^{-2}\,nm^{-1}}}{\mathrm{W\,nm^{-1}}} = \mathrm{m^{-2}}\right]. \tag{1}
\]

The irradiance sensor used to measure E(r,𝜃) is a plane irradiance sensor, which has a cosine response for angles relative to the normal to the detector surface.



Figure 1: Geometry for defining the beam spread function.

The Point Spread Function

Now consider a source that emits spectral intensity I(γ) in a cosine pattern as illustrated by the short green arrows in Fig. 2. The angular distribution of the emitted intensity is

\[
I(\gamma) =
\begin{cases}
\dfrac{P}{\pi}\cos\gamma & \text{for } 0 \le \gamma \le \dfrac{\pi}{2} \qquad \left[\mathrm{W\,nm^{-1}\,sr^{-1}}\right] \\[2ex]
0 & \text{for } \dfrac{\pi}{2} < \gamma \le \pi
\end{cases}
\]

Integrating this intensity over all solid angles shows that the total emitted spectral power is P:

\[
\int_0^{2\pi}\!\!\int_0^{\pi} I(\gamma)\,\sin\gamma\,d\gamma\,d\alpha = P.
\]

Here α is the azimuthal angle; the emitted intensity is independent of α.
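Carrying out this integral for the cosine pattern above confirms the normalization: since I(γ) vanishes for γ > π/2,

\[
\int_0^{2\pi}\!\!\int_0^{\pi/2} \frac{P}{\pi}\cos\gamma\,\sin\gamma\,d\gamma\,d\alpha
= \frac{P}{\pi}\,(2\pi)\left[\frac{\sin^2\gamma}{2}\right]_0^{\pi/2}
= \frac{P}{\pi}\,(2\pi)\,\frac{1}{2} = P.
\]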

As illustrated by the long green arrows in Fig. 2, the emitted intensity will give rise to a radiance L(r,𝜃), where r is the distance from the source and 𝜃 is the angle measured from the γ = 0 axis of the emitted intensity.

With these definitions, the point spread function (PSF) is defined as the detected radiance normalized by the maximum of the emitted intensity, which for the cosine pattern above is I(0) = P/π:

\[
\mathrm{PSF}(r,\theta) \equiv \frac{L(r,\theta)}{P/\pi} \qquad \left[\frac{\mathrm{W\,m^{-2}\,nm^{-1}\,sr^{-1}}}{\mathrm{W\,nm^{-1}\,sr^{-1}}} = \mathrm{m^{-2}}\right]. \tag{2}
\]


Figure 2: Geometry for defining the point spread function.

Figure 3 shows a photograph of a street light seen through a foggy atmosphere. Although a street light is not a cosine-emitting point source, the angular pattern of the "glow" seen around the light gives a qualitative feeling for the PSF, which is just the pattern of the radiance distribution centered on the source.



Figure 3: Qualitative illustration of the PSF for a street light seen through a foggy atmosphere.

Equivalence of the PSF and BSF

Figures 1 and 2 are drawn to highlight the symmetry between the BSF and the PSF. The principle of electromagnetic reciprocity says that if light follows a particular path from one point A to another point B, then light leaving B in the opposite direction will retrace the original path back to A. This is illustrated by the green arrows connecting the sources and receivers in the two figures. This reciprocity suggests that the BSF and PSF contain equivalent information. Figure 4 shows PSF and BSF measurements from two very similar water bodies obtained by Mertens and Replogle (1977). The instruments were mounted on a 20 m long underwater frame, which allowed a maximum range r between source and detector of approximately 19 m. Although the water IOPs were slightly different for these two measurements, the closeness of the PSF and BSF over several orders of magnitude suggests that the BSF and PSF are numerically equal. Mertens and Replogle stated this equality without proof. Gordon (1994b) started with the radiative transfer equation and used reciprocity to show that the BSF and PSF are indeed numerically equal.



Figure 4: BSF and PSF at 488 nm measured on two different days in very clear Bahamas water. The range was approximately 19 m. The BSF was measured when the water beam attenuation length was 1/c(488) = 6.8 m. The PSF was measured on the following day, when 1/c(488) = 7.0 m. Curves extracted from Mertens and Replogle (1977), Figs. 4 and 6.

It is important to understand the difference between the volume scattering function (VSF) and the beam or point spread function. The VSF describes the redirection of light by a single scattering event. The VSF depends only on the scattering angle. The PSF describes the cumulative effects of scattering and absorption as the light travels from the source to the detector. The PSF therefore depends on both the scattering and absorption properties of the medium, and on the distance from the source to the detector.

The PSF and Image Prediction

The PSF plays a fundamental role in the prediction of how an object appears when seen through an absorbing and scattering medium such as water. Consider the image of a shark seen in Fig. 5. Let S(x,y,0) represent the "bright-dark" pattern of the shark image when seen at distance z = 0, or as seen through a vacuum. Thus, for a gray-scale digital image with 8 bit resolution, S = 0 for a black pixel and S = 255 for a white pixel. When viewing a particular point (xo,yo) in the image through a vacuum at some distance z from the shark, the detected radiance comes only from point (xo,yo) and has the value S(xo,yo,0). However, when viewing point (xo,yo) in the image through a scattering medium, every point in the scene contributes to the radiance seen at (xo,yo). The concentric white circles in the figure represent contours of the PSF(z,𝜃). Point (x1,y1) thus contributes a radiance of S(x1,y1,0)PSF(z,𝜃1), where angle 𝜃1 = tan⁻¹(r1/z). Likewise, point (x2,y2) contributes radiance S(x2,y2,0)PSF(z,𝜃2), where angle 𝜃2 = tan⁻¹(r2/z). The total radiance at point (xo,yo) is given by summing the contributions from all points in the image:

\[
S(x_o,y_o,z) = \int_{\text{all } x}\int_{\text{all } y} S(x,y,0)\,\mathrm{PSF}(x_o - x,\, y_o - y,\, z)\,dx\,dy, \tag{3}
\]

where the PSF argument (xo − x, yo − y, z) is shorthand for distance z and angle

\[
\theta(x,y,z) = \tan^{-1}\!\left(\frac{\sqrt{(x_o - x)^2 + (y_o - y)^2}}{z}\right)
\]

when the shark is seen from distance z. Thus, given the actual appearance of an object and the PSF of the medium, the appearance of the object as seen through the medium can be computed.
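As a concrete illustration of the discrete form of Eq. (3), the Python sketch below convolves a simple synthetic scene with a Gaussian-shaped kernel standing in for PSF(z,𝜃). The function gaussian_psf and all numerical values are illustrative assumptions, not a model of an actual oceanic PSF.

# A minimal sketch of the discrete form of Eq. (3). The Gaussian kernel is a
# hypothetical stand-in for a measured or modeled PSF(z, theta).
import numpy as np
from scipy.signal import convolve2d

def gaussian_psf(size, sigma):
    # Hypothetical PSF sampled on a size-by-size pixel grid, normalized to unit sum.
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return kernel / kernel.sum()

# S(x, y, 0): the scene as seen through a vacuum (a bright square on a dark background).
scene = np.zeros((128, 128))
scene[48:80, 48:80] = 255.0

# Discrete Eq. (3): each scene point contributes to every image point,
# weighted by the PSF evaluated at the offset (xo - x, yo - y).
psf = gaussian_psf(size=31, sigma=4.0)
blurred = convolve2d(scene, psf, mode='same', boundary='fill')
print(blurred.shape, round(float(blurred.max()), 2))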



Figure 5: Illustration of the use of the PSF in image prediction.

Integrals of the form h(xo) = ∫ f(x) g(xo − x) dx ≡ f ∗ g are known as convolution integrals. Equation (3) is a 2D convolution. These integrals have the property that the Fourier transform of a convolution is the product of the Fourier transforms of the two functions. This leads to a very important procedure for image analysis. Rather than compute the double integral of Eq. (3) for every point (xo,yo) of the scene, the Fourier transforms of S(x,y,0) and PSF(x,y,z) can be computed and multiplied, and then the inverse transform of the product gives the final image. This topic will be developed on other pages; meanwhile see Goodman (1996). (Incidentally, the term "convolution" is frequently misused. An integral of the form ∫ f(x) g(x) dx is not a convolution; it is just the integral of the product of two functions f and g or, if you wish, of one function weighted by the other. Such integrals do not occur in image analysis and do not have a convenient Fourier transform.)
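As a quick check of this Fourier-transform shortcut, the sketch below compares a direct 2D convolution with an FFT-based one (transform, multiply, invert). The random arrays are arbitrary stand-ins for S(x,y,0) and a unit-normalized PSF(x,y,z).

# Sketch comparing direct and FFT-based evaluation of a 2D convolution.
import numpy as np
from scipy.signal import convolve2d, fftconvolve

rng = np.random.default_rng(0)
scene = rng.random((64, 64))   # stand-in for S(x, y, 0)
psf = rng.random((15, 15))
psf /= psf.sum()               # stand-in for PSF(x, y, z), normalized to unit sum

direct = convolve2d(scene, psf, mode='same')    # evaluate the double integral point by point
via_fft = fftconvolve(scene, psf, mode='same')  # multiply Fourier transforms, then invert
print(np.allclose(direct, via_fft))             # True to floating-point precision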

Finally, note that in a vacuum and with a perfect optical system, the PSF reduces to a 2D Dirac delta function: PSF(xo − x, yo − y) = δ(xo − x) δ(yo − y), in which case

\[
S(x_o,y_o,z) = \int_{\text{all } x}\int_{\text{all } y} S(x,y,0)\,\delta(x_o - x)\,\delta(y_o - y)\,dx\,dy = S(x_o,y_o,0).
\]

Thus the image at distance z is the same as at distance 0.

Because of the importance of the PSF in image analysis, considerable effort has been expended to develop models of the PSF as a function of water IOPs. A number of such models are reviewed in Hou et al. (2008). The page on The Lidar Equation shows the fundamental role of the BSF in lidar remote sensing.
