This lecture will qualitatively explain the operating principle of the FINCH EYE hyperspectral imager, which is based on a push-broom architecture and employs a GRISM dispersive element. The relationships between the various parameters will be explained at a high level.

The FINCH EYE is a hyperspectral earth imaging payload meant to study crop residue cover over agricultural land. Hyperspectral imaging means that images have both a spatial component (x, y) and a spectral component (lambda), and this can be visualized with a “data cube”. By studying the differences in the spectra of each plot of land, one can distinguish its relative composition of soil, crop residue, and photosynthetic vegetation. The satellite image depicts the target location of an experimental farm, with which UTAT Space Systems has partnered to validate the initial measurements taken by FINCH EYE.
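To make the data cube concrete, the following is a minimal sketch in NumPy: a (y, x, lambda) cube populated with two made-up reference spectra, with each pixel classified by its nearest spectrum. The dimensions, wavelength range, and spectra are purely illustrative, not FINCH EYE's actual bands or targets.

```python
import numpy as np

# Hypothetical data cube: two spatial axes (y, x) and one spectral axis (lambda).
ny, nx, nb = 4, 4, 50
wavelengths = np.linspace(400.0, 1000.0, nb)  # nm, notional range

# Made-up reflectance spectra for two ground-cover classes.
soil = 0.25 + 0.0002 * (wavelengths - 400)                    # flat-ish spectrum
veg = 0.05 + 0.45 / (1 + np.exp(-(wavelengths - 700) / 15))   # "red edge" rise

cube = np.empty((ny, nx, nb))
cube[: ny // 2] = soil      # top half of the toy scene is soil
cube[ny // 2 :] = veg       # bottom half is vegetation

# Classify each (y, x) pixel by the nearest reference spectrum (Euclidean distance).
refs = np.stack([soil, veg])                                  # shape (2, nb)
dists = np.linalg.norm(cube[..., None, :] - refs, axis=-1)    # shape (ny, nx, 2)
labels = dists.argmin(axis=-1)                                # 0 = soil, 1 = vegetation
```

The key point is that every pixel carries a full spectrum, so composition can be inferred per plot of land rather than per scene.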

image.png

As one may be aware, camera sensors are two-dimensional, yet the data we are collecting have a third, wavelength dimension. In typical color camera sensors, color sensing is accomplished by overlaying an array of colored filters (known as a Bayer matrix) on the pixels and mapping them to R, G, B pixel values. However, this approach only works well for a small number of spectral channels and when signal photons are abundant, as it requires smaller pixels and filters out the majority of photons of the “wrong color”. Other methods unravel the data cube along one or two dimensions by means of scanning, and the push-broom architecture is one of them.
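The photon loss of a filter mosaic can be sketched numerically: the example below samples a toy RGB scene through a hypothetical RGGB Bayer pattern, keeping one color channel per pixel. The scene and pattern layout are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
rgb = rng.random((4, 4, 3))          # toy RGB scene, values in [0, 1)

# RGGB Bayer pattern: each pixel records exactly one of the three channels.
mosaic = np.zeros((4, 4))
mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red sites
mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green sites
mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green sites
mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue sites

# Only 1 of 3 channel samples survives per pixel: roughly two thirds of the
# incident spectral information (and photons) is filtered out.
kept_fraction = mosaic.size / rgb.size
```

With many narrow spectral channels instead of three broad ones, the filtered-out fraction would be even larger, which is why this approach does not scale to hyperspectral imaging.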

image.png

image.png

The push-broom architecture (depicted above right) images one “stripe” on the ground at a time, with the spectrum being imaged along the orthogonal axis of the camera sensor. After magnification by a regular objective lens or telescope, the “stripe” is selected with a slit; all other light is blocked. A dispersive element then spreads the white light into different colors along the orthogonal axis, and relay optics re-image the (x, lambda) data onto the camera sensor. As the satellite travels over the earth, it sweeps across adjacent stripes on the ground, thereby scanning across the y axis. Note that there are no moving parts within the imager, as the scanning is accomplished by the physical motion of the satellite in its orbit.
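The scanning described above can be sketched as a loop: each sensor readout is a 2-D (x, lambda) frame, and stacking successive frames as the satellite advances supplies the missing y axis. The frame dimensions and the `read_frame` stand-in below are hypothetical.

```python
import numpy as np

nx, nb, n_steps = 32, 50, 20   # along-slit pixels, spectral bands, time steps

def read_frame(t):
    """Hypothetical stand-in for one (x, lambda) sensor readout at time step t."""
    rng = np.random.default_rng(t)
    return rng.random((nx, nb))

# One stripe per time step; orbital motion sweeps the ground track under the slit.
frames = [read_frame(t) for t in range(n_steps)]

# Stacking the frames along a new leading axis reconstructs the (y, x, lambda) cube.
cube = np.stack(frames, axis=0)
```

Note that no instrument parameter changes between readouts; the y axis comes entirely from the satellite's motion, consistent with the imager having no moving parts.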

The FINCH EYE needs to fit within a compact form factor: only 1.5U of volume is allocated to the payload from the 3U satellite, which significantly challenges the optical design. Hence, fairly modest performance specifications are expected of the final payload. The following illustration shows the components that will make up the payload.

Summer 2024 Onboarding.png

It is helpful to think of the spatial (x) and spectral (lambda) axes separately while understanding how the FINCH EYE works. The figures below illustrate how light propagates through the optical system. The top image shows the spectral axis (i.e., the different colors coming from one target on the ground), while the bottom image shows the spatial axis (i.e., the contiguous ground targets that comprise the stripe currently being imaged).

Ray Trace Diagram.png

We will now explain how all of these components “fit together” and function in order to produce hyperspectral images.

Typically, an objective lens is used to produce a high-quality image of the target, with minimal aberration, on a camera sensor or photographic film. However, since we do not desire a two-dimensional aerial view of the earth on the camera, we place a slit at the back focal plane of the objective lens and block the unwanted portion of the image from propagating through. The portion that does pass through, corresponding to a stripe on the ground, is subsequently re-imaged onto the camera sensor with a lens relay. A lens relay is an arrangement of two “lenses” wherein an object placed at the front focal point of the first lens is imaged at the back focal point of the second lens. Between the relay lenses, the rays originating from each object point are parallel, or collimated. Therefore, aside from incurring some aberrations, the lens relay does not modify the “spatial” axis of the image.
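The imaging property of the relay can be verified with 2x2 ray-transfer (ABCD) matrices: chaining free-space propagation and thin-lens matrices from the front focal plane of lens 1 to the back focal plane of lens 2 should give B = 0 (an imaging condition) and A = -f2/f1 (the transverse magnification). The focal lengths below are arbitrary example values, not FINCH EYE's.

```python
import numpy as np

def prop(d):
    """Free-space propagation over distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def lens(f):
    """Thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

f1, f2 = 50.0, 100.0   # mm, illustrative focal lengths

# Front focal plane -> lens 1 -> gap of f1 + f2 -> lens 2 -> back focal plane.
# Matrices apply right-to-left, in the order the ray encounters each element.
M = prop(f2) @ lens(f2) @ prop(f1 + f2) @ lens(f1) @ prop(f1)

# B = M[0, 1] = 0 means the output plane is an image of the input plane;
# A = M[0, 0] is the transverse magnification, -f2/f1 for this relay.
A, B = M[0, 0], M[0, 1]
```

With f1 = f2 the relay re-images the slit at unit magnification (inverted), which is why it leaves the spatial axis of the image essentially unchanged.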

lens relay.png

Lens relay (above) and a classic Czerny-Turner spectrometer (right).

image.png