Recommended readings:
Full spectrum: Multispectral imagery and hyperspectral imagery · UP42
Spectral Analysis of Images of Plants under Stress using a Close-Range Camera
Reference material (optional; not required reading):
Lanfri 2010 - Vegetation analysis using remote sensing (master's thesis)
We would like to motivate the curious reader with the importance of the study of spectra. A “spectrum” (absorption, transmission, fluorescence, Raman, etc.) is fundamentally shaped by a material’s composition, molecular structure, and atomic energy levels. Hence, spectra can be used to identify materials or to study their properties. Many tests involve searching for the presence or absence, or the strength, of specific spectral peaks associated with some material property.
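To make this concrete, the short sketch below measures the strength of one absorption feature by comparing the reflectance at the feature wavelength against a straight-line “continuum” drawn between two shoulder wavelengths. This is illustrative only; the wavelengths, synthetic spectrum, and function names are hypothetical placeholders, not FINCH data.

```python
import numpy as np

def band_depth(wavelengths, reflectance, left_nm, feature_nm, right_nm):
    """Depth of the feature at `feature_nm` below the straight line (continuum)
    joining the reflectance values at `left_nm` and `right_nm`."""
    r_left, r_feature, r_right = np.interp(
        [left_nm, feature_nm, right_nm], wavelengths, reflectance
    )
    # Continuum value at the feature wavelength (linear interpolation between shoulders).
    t = (feature_nm - left_nm) / (right_nm - left_nm)
    continuum = (1 - t) * r_left + t * r_right
    return (continuum - r_feature) / continuum  # 0 = no feature, larger = deeper

# Made-up spectrum with a dip near 670 nm (a chlorophyll absorption region):
wl = np.linspace(400, 900, 251)                      # wavelengths in nm
spec = 0.4 - 0.15 * np.exp(-((wl - 670) / 20) ** 2)  # synthetic reflectance
print(band_depth(wl, spec, 600, 670, 740))           # ~0.37, i.e. a clear feature
```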
For the FINCH satellite, the scientific objective is to quantify the level of crop residue over farmland. Crop residue is the left-over biomass from harvest. Too much leftover crop residue can harbor pests, while too little can leave the soil less able to retain rainwater, increasing runoff. Crop residue also sequesters (and emits) greenhouse gases at various points in its lifecycle. In any case, it is important to monitor the amount of crop residue over wide swaths of farmland to assist with sustainable agriculture and land use. To accomplish this, it is necessary to distinguish crop residue from soil and photosynthetic vegetation by their spectral signatures.
While this may seem impossible because everything “looks green-ish” to the naked (RGB) eye, careful attention can be paid to spectral features associated with plant matter. The figure below illustrates how hyperspectral data processing emphasizes this difference. Consider the “Original RGB Image” in the middle. Everything looks a different shade of green, and hence one might conclude it is all fields and forests. However, after applying a simple formula to calculate the Normalized Difference Vegetation Index (NDVI), one obtains a scale of how “close” each pixel is to actually being plant matter. Observe that the big dark region in the center and the smaller light green region in the lower right corner of the Original RGB Image get mapped to “-1” on the NDVI scale. This is because these regions actually represent a body of water, with the underlying ground or algae coloring it green. Regions of dried-up grass fields also map to the low, negative end of the NDVI scale.
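For reference, NDVI is computed per pixel as NDVI = (NIR − Red) / (NIR + Red), which ranges from −1 to +1: dense green vegetation pushes the value toward +1, while water and other non-vegetated surfaces sit near or below zero. Below is a minimal sketch, assuming the red and near-infrared bands are already available as NumPy arrays of the same shape; the toy pixel values are placeholders, not data from the figure.

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """Normalized Difference Vegetation Index, computed pixel by pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero

# Toy 2x2 example: a bright-NIR "plant" pixel scores near +1,
# while a "water-like" pixel (red > NIR) scores negative.
red = np.array([[0.05, 0.30], [0.10, 0.25]])
nir = np.array([[0.60, 0.05], [0.55, 0.20]])
print(ndvi(nir, red))
```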
Our goal with FINCH EYE is to quantify crop residue on farmland via hyperspectral imaging in the Short-Wave Infra-Red range (SWIR, 0.9-1.7um). While detector technology is cheaper for the visible range, there is only a slight difference between healthy and stressed plants from 0.4-0.7um. On the other hand, there are dramatic spectral differences in the Infra-Red and beyond, but capturing them requires more advanced detector technology. An initial mission scoping conducted in early 2022 concluded that there was a shortage of available data in the SWIR range, so the mission will help fill this gap in scientific datasets. While the “best” wavelengths for crop residue mapping are in the 2-2.4um range, there are technological and financial difficulties in integrating such a detector within a 3U CubeSat form factor.
Figure from: Full spectrum: Multispectral imagery and hyperspectral imagery · UP42
Figure from: 000 isprs-archives-XLVIII-1-W3-2023-63-2023.pdf
While we are still developing the FINCH EYE Hyperspectral Imager, we are concurrently designing a prototype VIS-NIR hyperspectral imager without the strict volume, mass, vibration, and temperature requirements of the satellite payload. We would like to bring your attention to a few applications that could benefit from spectra in the 0.4-0.7um (VIS, visible) and 0.6-0.9um (NIR, near infra-red) regions. We will attempt to conduct “functional tests” with our prototype, e.g. differentiating between materials, paint dyes, etc., to demonstrate that the prototype works.
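As one illustrative example of such a functional test (a common generic approach, not necessarily the method we will adopt), a measured pixel spectrum can be compared against a small library of reference spectra using the spectral angle between them: the smaller the angle, the more similar the spectral shape, largely independent of overall brightness. The reference names and values below are made-up placeholders.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra treated as vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify(pixel_spectrum, references):
    """Name of the reference spectrum with the smallest spectral angle."""
    return min(references, key=lambda name: spectral_angle(pixel_spectrum, references[name]))

# Hypothetical 4-band reference spectra for two paint samples.
references = {
    "blue_paint": np.array([0.45, 0.30, 0.10, 0.08]),
    "red_paint":  np.array([0.08, 0.10, 0.40, 0.35]),
}
measured = np.array([0.50, 0.33, 0.12, 0.09])  # brighter overall, but blue-paint-shaped
print(classify(measured, references))           # -> "blue_paint"
```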
Please continue this piece of text; we’ll author it together. I’ve brainstormed the following outline. Before starting, please read the three links I included at the top.
Brief list of hyperspectral imaging applications, one citation per bullet: