To select an analysis method, we should weigh the efficiency of different algorithms against our satellite's capabilities
Linear algorithms:
Pixel Purity Index (PPI)
Spectral Angle Mapper (SAM)
Notes on SAM:
Envi whitepaper: https://www.nv5geospatialsoftware.com/Portals/0/pdfs/Confirmation/Hyperspectral-Whitepaper.pdf
- Data limitations:
- Pushbroom (ours)
- “Raw” data as obtained has already been corrected for satellite motion errors - is our data corrected?
- Raw Digital number values vs Calibrated to radiance??
- Digital number - raw pixel values that aren’t calibrated to anything useful
- Radiance - self explanatory; ENVI's “Radiometric Calibration” tool converts DN to radiance, assuming the calibration parameters are included in the metadata (has to be in the right units)
- Seems straightforward, but it also says it corrects for neighbouring pixels' radiance - seems like a chicken-and-egg sort of thing?
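To sanity-check what the calibration step actually does: DN-to-radiance is just a per-band linear rescaling. A minimal sketch (all gains, offsets, and DN values below are made up for illustration; real coefficients come from the sensor metadata):

```python
import numpy as np

# Hypothetical per-band calibration coefficients (made up; real values
# come from the sensor metadata).
gain = np.array([0.012, 0.010, 0.015])   # (W / (m^2 sr um)) per DN
offset = np.array([-1.2, -0.8, -1.5])    # W / (m^2 sr um)

# A tiny (rows, cols, bands) cube of raw digital numbers.
dn = np.array([[[1200, 1300, 1100],
                [1250, 1280, 1150]]], dtype=float)

# Linear calibration, broadcast over the band axis.
radiance = dn * gain + offset
print(radiance[0, 0])
```

The same scaling applies per band across the whole image, which is why the units in the metadata have to be checked before feeding the result into anything downstream.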
- Reflectances, atmospheric and surface based
- Atmospheric reflectance depends on water vapour and other constituents of the atmosphere, which vary on hourly-to-daily timescales? Which source do we use - presumably not too much variation if water vapour is concentrated in the lower atmosphere
- https://www.ncl.ac.uk/tcmweb/bilko/module7/lesson3.pdf and http://geography.middlebury.edu/data/gg1002/Handouts/ComputingReflectanceFromDN.pdf
- Angle between the sun, the surface being measured, and the satellite - affects brightness?
- But it also says radiance is inversely proportional to the Sun-Earth distance; not really dependent on angle (since the cos of a really small angle is roughly 1)
- Irradiance != radiance apparently: radiance is whatever reaches the satellite, irradiance is the incoming sun-to-earth radiation (thought it was like a flammable = inflammable sort of thing)
- Reflectance = radiance/irradiance - which also includes atmospheric?
- Overpass horizontal angle? Different in winter and summer (although crop residue is only really present during summer); same timezone, so it doesn't really matter
- “The spectral signature of a habitat (say seagrass) is not transferable if measured in digital numbers. The values are image specific - i.e. they are dependent on the viewing geometry of the satellite at the moment the image was taken, the location of the sun, specific weather conditions, and so on. It is generally far more useful to convert the DN values to spectral units” - that makes sense
- Radiation passes through atmosphere twice - different forms of absorption though?
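The radiance-to-reflectance step described in the two handouts linked above is also a per-band rescaling. A sketch of the standard top-of-atmosphere reflectance formula (all numeric values here are made up; ESUN, the Earth-Sun distance, and the sun elevation would come from metadata and ephemeris tables):

```python
import math

radiance = 45.0        # at-sensor radiance, W/(m^2 sr um) - made up
esun = 1536.0          # mean solar exoatmospheric irradiance, same band - made up
d = 1.01               # Earth-Sun distance in astronomical units - made up
sun_elevation = 55.0   # degrees, from image metadata - made up

solar_zenith = math.radians(90.0 - sun_elevation)

# TOA reflectance = pi * L * d^2 / (ESUN * cos(theta_s)):
# the pi converts band irradiance to radiance units, d^2 corrects for
# Earth-Sun distance, and cos(theta_s) corrects for sun angle.
reflectance = math.pi * radiance * d ** 2 / (esun * math.cos(solar_zenith))
print(round(reflectance, 4))
```

This only gives top-of-atmosphere reflectance; surface reflectance additionally needs an atmospheric correction like QUAC or FLAASH, as the whitepaper discusses below.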
- Spatial resolution - smallest object that can be detected
- Larger field of view, less detail, obviously
- Water-heavy bands get removed - didn’t we have some issue with that?
- Wouldn’t water heavy atmospheric conditions also imply clouds, so we wouldn’t be able to see through anyways?
- Select our own “bad” bands, or select good bands (which masks out the bad ones)
- Have to set a “data ignore value”
- Spectral Angle Mapper works on reflectance, NOT radiance (which means an irradiance value is required)
- QUAC (Quick Atmospheric Correction) converts radiance to surface reflectance without other data (only from the pixels themselves); easiest tool
- Works best in uniform illumination, so no clouds (answers above cloud question)
- FLAASH - uses the MODTRAN radiative-transfer model, retrieves water vapour values, can introduce errors called “artifacts”; most accurate though
- Empirical methods, not most accurate
- Needs field reflectance spectra and manually identified regions; seems a little time-consuming
- IAR - takes a scene average as the reference; best for uniform scenes, which ours is not
- Other ones also don’t seem super useful
- Data dimensionality
- Bands have overlapping data
- Water absorption bands then overlap? How much “bleed” is there (not sure of the actual name)
- Data transforms; used to reduce noise and dimensionality
- PCA and minimum noise fraction (MNF): methods to reduce to the lowest possible dimensionality without losing data
- Machine learning does not require dimensionality reduction because the models are supposed to discard irrelevant data already
- MNF is the better method: it separates out the noise, making it easier to find purer endmembers
- Estimates noise via a “shift difference” method?
- To improve estimate, if possible have a reference pure endmember really close to where relevant other stuff is
- Has to have more pixels than bands (why? probably because estimating the band-to-band covariance matrix needs more samples than dimensions)
- “Eigenvalues approach 1” for noise-dominated components - so are the near-1 components the irrelevant ones, with eigenvalues > 1 carrying the signal?
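A toy numpy sketch of the MNF idea as I understand it - shift-difference noise estimate, noise whitening, then PCA on the whitened data. The cube below is synthetic (smooth signal plus made-up noise), and this is an illustration of the concept, not ENVI's exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cube: a signal that varies smoothly down the rows, plus noise.
rows, cols, bands = 40, 40, 6
signal = np.linspace(0, 1, rows)[:, None, None] * rng.uniform(0.5, 2.0, bands)
cube = signal + rng.normal(0.0, 0.05, (rows, cols, bands))

x = cube.reshape(-1, bands)

# 1. Shift-difference noise estimate: neighbouring pixels are nearly
#    identical, so their difference is mostly noise (with doubled variance).
diff = (cube[:, :-1, :] - cube[:, 1:, :]).reshape(-1, bands)
noise_cov = np.cov(diff, rowvar=False) / 2.0

# 2. Whiten the noise, then run PCA on the whitened data.
evals_n, evecs_n = np.linalg.eigh(noise_cov)
whitener = evecs_n / np.sqrt(evals_n)   # scales noise to unit variance
xw = (x - x.mean(axis=0)) @ whitener
evals, _ = np.linalg.eigh(np.cov(xw, rowvar=False))

# Descending eigenvalues: components near 1 are noise-dominated;
# larger eigenvalues carry coherent signal.
snr_eigenvalues = evals[::-1]
print(snr_eigenvalues)
```

Running this on the synthetic cube gives one large eigenvalue (the smooth signal) and the rest near 1 (the whitened noise), which matches the "eigenvalues approach 1" language in the whitepaper.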
- 2 different scenarios of using ENVI
- Either determining types of endmembers (like percent composition) or locations
- Spectral libraries are probably relevant, connected to ENVI already, may need our own
- Ensure scale and units match
- Target detection? Seems kinda hard to detect the exact target crop residue with the resolution of a football field; may be missing something
- Background spectra, e.g. for the edge of fields or if there's a house or something - do/can we subtract it from the image beforehand to just get the field?
- Spectral angle mapping
- Smallest angle difference
- Spectral Angle Mapper Algorithm for Remote Sensing Image Classification
- Based on assumption that one pixel is one type of thing
- Spectrum is vector in n dimensional space, where n is number of bands
- Thematic information extraction - supervised classification??
- Training sites; “hard classification” because each pixel is assigned to only one class even if it clearly belongs to several
- Dark spots, higher chance of being in class
- SAM uses only the direction of the spectrum vector, not its magnitude, so brightness/gain is sort of irrelevant (though there must be some minimum illumination for detection?)
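A minimal sketch of the SAM measure itself - each spectrum is an n-band vector, and similarity is the angle between them, so scaling a spectrum (brightness) changes nothing. The spectra below are made up:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between two spectra viewed as n-band vectors."""
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

reference = np.array([0.10, 0.30, 0.50, 0.40])  # made-up reference spectrum
bright = 2.5 * reference                         # same material, brighter pixel
other = np.array([0.50, 0.40, 0.20, 0.10])       # made-up different material

print(spectral_angle(bright, reference))  # ~0: brightness cancels out
print(spectral_angle(other, reference))   # a much larger angle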
- kernel spectral angle mapper
- SAM is good because it's simple, but it only considers second-order angle dependencies?? (something to do with diffraction?)
- Math is way above my head
- Works better for non-linear cases, which is what we have?
- EFFECTIVENESS OF SPECTRAL SIMILARITY MEASURES TO DEVELOP PRECISE CROP SPECTRA FOR HYPERSPECTRAL DATA ANALYSIS - example comparing accuracy of SAM to other stuff, not super accessible or useful
- https://www.mathworks.com/help/images/ref/sam.html#mw_0ab1df07-88e8-404c-bd06-54cf363ee54f seems kinda useful
- improvement on sam (SCM)
- SAM uses absolute value; cannot distinguish between negative and positive correlation
- Pearson correlation coefficient subtracts mean value, standardizes value to provide better estimates
- If the reference spectrum is inverted (opposite shape), SAM may still output high similarity
- Pearsonian correlation can eliminate “shading effect” (stretching?)
- SCM goes from -1 to 1, whereas SAM is 0 to 1 (I guess because the cosine between non-negative spectra can't be negative?)
- Shading effect? I thought SAM didn’t rely on intensity?
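A sketch of the SAM-vs-SCM distinction in the paper: SCM is just the Pearson correlation of the two spectra (mean-centred), so an inverted spectrum scores -1 instead of looking moderately similar. The spectra below are made up:

```python
import numpy as np

def sam_similarity(a, b):
    """Cosine of the spectral angle (SAM): 0..1 for non-negative spectra."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def scm_similarity(a, b):
    """Pearson correlation (SCM): mean-centred, so it ranges -1..1."""
    ac, bc = a - a.mean(), b - b.mean()
    return np.dot(ac, bc) / (np.linalg.norm(ac) * np.linalg.norm(bc))

ref = np.array([0.2, 0.4, 0.6, 0.8])
inverted = np.array([0.8, 0.6, 0.4, 0.2])  # opposite spectral shape

print(sam_similarity(inverted, ref))  # still fairly high
print(scm_similarity(inverted, ref))  # -1: shape is exactly opposite
```

Mean-centring is also what removes an additive brightness offset ("shading"), which SAM's pure direction comparison does not.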
- short wave IR spectral mapping
- machine learning for spectral analysis
- Seems kinda neat
- Easier to detect “change in” spectrum, so over time
- example using SAM for land degradation mapping
- comparing error metrics
- https://www.jswconline.org/content/jswc/71/5/385.full.pdf
- Seems pretty relevant
Non-linear algorithms:
Spectral Information Divergence (SID)
Sources:
Spectral Information Divergence for Hyperspectral Image Analysis [LINK]
Assessment of Different Spectral Unmixing Techniques on Space Borne Hyperspectral Imagery [LINK]
Supervised Classification Approaches to Analyze Hyperspectral Dataset [LINK]
Technical breakdown of SMA in a 50-page textbook chapter
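A minimal sketch of what SID computes, based on the first source above: each spectrum is normalised into a probability distribution, and SID is the symmetric sum of the two relative entropies (KL divergences). The spectra are made up:

```python
import numpy as np

def sid(x, y, eps=1e-12):
    """Spectral Information Divergence between two non-negative spectra.

    Each spectrum is normalised to sum to 1, then SID is
    KL(p || q) + KL(q || p); eps guards against log(0).
    """
    p = x / x.sum()
    q = y / y.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))) +
                 np.sum(q * np.log((q + eps) / (p + eps))))

a = np.array([0.1, 0.3, 0.5, 0.4])   # made-up spectrum

print(sid(a, 3.0 * a))   # ~0: scaling doesn't change the distribution
print(sid(a, a[::-1]))   # > 0: a different spectral shape
```

Like SAM, SID is insensitive to overall brightness, but it compares whole distribution shapes rather than a single angle, which is where its non-linear behaviour comes from.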