class: center, title-slide, middle background-image: url("img/CASA_Logo_no_text_trans_17.png") background-size: cover background-position: center <style> .title-slide .remark-slide-number { display: none; } </style> # Remotely Sensing Cities and Environments ### Lecture 9: Synthetic Aperture Radar (SAR) data ### 4/10/2022 (updated: 16/03/2023)
[a.maclachlan@ucl.ac.uk](mailto:a.maclachlan@ucl.ac.uk)
[andymaclachlan](https://twitter.com/andymaclachlan)
[andrewmaclachlan](https://github.com/andrewmaclachlan)
[Centre for Advanced Spatial Analysis, UCL](https://www.ucl.ac.uk/bartlett/casa/)
[PDF presentation](https://github.com/andrewmaclachlan/CASA0023-lecture-9/blob/main/index.pdf) <a href="https://github.com/andrewmaclachlan" class="github-corner" aria-label="View source on GitHub"><svg width="80" height="80" viewBox="0 0 250 250" style="fill:#fff; color:#151513; position: absolute; top: 0; border: 0; left: 0; transform: scale(-1, 1);" aria-hidden="true"><path d="M0,0 L115,115 L130,115 L142,142 L250,250 L250,0 Z"></path><path d="M128.3,109.0 C113.8,99.7 119.0,89.6 119.0,89.6 C122.0,82.7 120.5,78.6 120.5,78.6 C119.2,72.0 123.4,76.3 123.4,76.3 C127.3,80.9 125.5,87.3 125.5,87.3 C122.9,97.6 130.6,101.9 134.4,103.2" fill="currentColor" style="transform-origin: 130px 106px;" class="octo-arm"></path><path d="M115.0,115.0 C114.9,115.1 118.7,116.5 119.8,115.4 L133.7,101.6 C136.9,99.2 139.9,98.4 142.2,98.6 C133.8,88.0 127.5,74.4 143.8,58.0 C148.5,53.4 154.0,51.2 159.7,51.0 C160.3,49.4 163.2,43.6 171.4,40.1 C171.4,40.1 176.1,42.5 178.8,56.2 C183.1,58.6 187.2,61.8 190.9,65.4 C194.5,69.0 197.7,73.2 200.1,77.6 C213.8,80.2 216.3,84.9 216.3,84.9 C212.7,93.1 206.9,96.0 205.4,96.6 C205.1,102.4 203.0,107.8 198.3,112.5 C181.9,128.9 168.3,122.5 157.7,114.1 C157.9,116.9 156.7,120.9 152.7,124.9 L141.0,136.5 C139.8,137.7 141.6,141.9 141.8,141.8 Z" fill="currentColor" class="octo-body"></path></svg></a><style>.github-corner:hover .octo-arm{animation:octocat-wave 560ms ease-in-out}@keyframes octocat-wave{0%,100%{transform:rotate(0)}20%,60%{transform:rotate(-25deg)}40%,80%{transform:rotate(10deg)}}@media (max-width:500px){.github-corner:hover .octo-arm{animation:none}.github-corner .octo-arm{animation:octocat-wave 560ms ease-in-out}}</style> --- # How to use the lectures - Slides are made with [xaringan](https://slides.yihui.org/xaringan/#1) -
In the bottom left there is a search tool that will search all the content of the presentation
- Control + F will also search
- Press enter to move to the next result
-
The tool in the top right lets you draw on the slides, although these annotations aren't saved.
- Pressing the letter `o` (for overview) will show an overview of the whole presentation, from which you can go to any slide
- Alternatively, just typing a slide number (e.g. 10) on the website will take you to that slide
- Pressing alt+F will fit the slide to the screen, which is useful if you have resized the window and have another open side by side.
---
# Lecture outline

.pull-left[
### Part 1: SAR fundamentals

### Part 2: Practical change detection with SAR
]

.pull-right[
<img src="img/satellite.png" width="100%" />
.small[Source: [Original from the British Library. Digitally enhanced by rawpixel.](https://www.rawpixel.com/image/571789/solar-generator-vintage-style)]]

---
class: inverse, center, middle

# Let's recall some of the intro slides....

---
# The two types of sensor

.pull-left[
<img src="img/Passive-and-active-sensors-systems-working-principles-24.png" width="100%" />
.small[Source: [Nadhir Al-Ansari](https://www.researchgate.net/figure/Passive-and-active-sensors-systems-working-principles-24_fig2_344464269)]
]

.pull-right[
**Active**
* Have their own energy source for illumination
* Actively emit electromagnetic waves and then wait to receive the return
* Such as: Radar, X-ray, LiDAR

**Passive**
* Use energy that is already available
* Don't emit anything
* Usually detect **reflected** energy from the sun
* Energy travels as electromagnetic waves...
* Such as: the human eye, a camera, a satellite sensor

Sensors can be mounted on any platform.
]

---
# Bat-like

<img src="img/echolocation.jpg" width="100%" />
.small[Source: [ASU](https://askabiologist.asu.edu/echolocation)]

---
class: inverse, center, middle

# Now some of the lecture 4 slides...and a few extras

---
# SAR floods

**Sensor**
* Sentinel-1 SAR

This image is from the eastern Australia floods during the 2022 La Niña (one of the ENSO phases):
* the trade winds blowing from South America intensify
* they draw up cool deep water and raise the thermocline
* the temperature difference across the Pacific increases and the Walker circulation intensifies - a feedback loop
* more cloud + more rain + cyclones in the West Pacific

<img src="img/SAR1.jpg" width="35%" style="display: block; margin: auto;" />
.small[Eastern Australia Floods. Source: [brockmann-consult](https://www.brockmann-consult.de/eastern-australia-floods/)]

---
# SAR floods

Why does the flooded area appear dark in the previous image?

--

Calm water (and other smooth surfaces) reflects the radar pulses away from the sensor

Rough surfaces scatter the pulses in all directions - more is sent back to the sensor

.small[Source: [NASA](https://nisar.jpl.nasa.gov/mission/get-to-know-sar/overview/#:~:text=Interpreting%20Radar%20Images&text=Regions%20of%20calm%20water%20and,scattered%20back%20to%20the%20antenna.)]

---
# SAR background

Synthetic Aperture Radar:
* Active sensors
* Provide surface texture data
* See through weather and clouds
* Different wavelengths - different applications

<img src="img/SAR_bands.png" width="50%" style="display: block; margin: auto;" />
.small[What is Synthetic Aperture Radar? Source: [NASA Earth Data](https://earthdata.nasa.gov/learn/backgrounders/what-is-sar)]

---
# SAR background

.pull-left[
1. Emit an electromagnetic signal (at the speed of light)
1. Record the amount of signal that bounces back = "backscatter"

But...the radar is moving...

1. It moves forward (in the azimuth) - a longer antenna gives a narrower beam and higher resolution
1. It sweeps the footprint (swath)

Images...

1. Pixels in the swath are imaged many times (as the sensor sweeps and moves)
1. **meaning** the distance to each pixel changes, and we know exactly by how much from the phase...
1. We combine these images to make a **"synthetic" aperture**
]

.pull-right[
<img src="img/Radar_in_motion_diagram.jpeg" width="100%" style="display: block; margin: auto;" />
.small[Get to know SAR. Source: [NASA](https://nisar.jpl.nasa.gov/mission/get-to-know-sar/overview/)]
]
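---
# Azimuth resolution: a rough worked example

A back-of-the-envelope sketch of why the synthetic aperture matters - textbook approximations with rounded Sentinel-1 numbers, so treat the figures as illustrative:

* For a real aperture, azimuth resolution `\(\approx \frac{\lambda R}{L}\)` (wavelength `\(\lambda\)`, range `\(R\)`, antenna length `\(L\)`)
* With `\(\lambda \approx 5.5\)` cm (C-band), `\(R \approx 700\)` km and `\(L \approx 12\)` m, that gives `\(\frac{0.055 \times 700000}{12} \approx 3200\)` m - about 3 km per pixel!
* For a synthetic aperture the theoretical limit is `\(\approx \frac{L}{2}\)`, i.e. about 6 m, independent of range

(The operational resolution of Sentinel-1 products differs by imaging mode, but the contrast illustrates the idea.)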
---
# SAR terms

.pull-left[
* Photography: the **aperture** lets more light in to change the focus

<img src="img/aperture-rules.jpg" width="100%" style="display: block; margin: auto;" />
.small[What is Aperture? Source: [City Academy](https://www.city-academy.com/news/what-is-aperture-in-photography/)]
]

.pull-right[
* RADAR aperture = the antenna

<img src="img/seasat_landing_image-300x300.png" width="100%" style="display: block; margin: auto;" />
.small[What is Synthetic Aperture Radar? Source: [NASA Earth Data](https://earthdata.nasa.gov/learn/backgrounders/what-is-sar)]
]

---
class: inverse, center, middle

## Longer antenna = narrower beam and a higher resolution

## BUT we can't have a long antenna in space..

## "synthetic" aperture = "synthesize a long antenna by combining signals, or echoes, received by the radar as it moves along a flight track"

.small[[Source: NASA get to know SAR](https://nisar.jpl.nasa.gov/mission/get-to-know-sar/overview/)]

---
# SAR polarization

.pull-left[
* There are also different polarizations:
  * the orientation of the plane in which the EMR waves are transmitted..
  * "direction of travel of an electromagnetic wave vector’s tip: vertical (up and down), horizontal (left to right), or circular (rotating in a constant plane left or right)."
* Single = transmits and receives in one plane, horizontal (or vertical)
* Dual = transmits in one plane but receives in both horizontal and vertical (e.g. VV + VH)
* HH = emitted in horizontal (H) and received in horizontal (H)
]

.pull-right[
<img src="img/polarisation.png" width="100%" style="display: block; margin: auto;" />
.small[Polarization. Source: [Wetland Monitoring and Mapping Using Synthetic Aperture Radar](https://www.intechopen.com/chapters/63701)]
]

---
# SAR polarization

Different surfaces respond differently to the polarizations
* Rough scattering (e.g. bare earth) = most sensitive to VV
* Volume scattering (e.g. leaves) = cross-polarization, VH or HV
* Double bounce (e.g. trees / buildings) = most sensitive to HH

<img src="img/SARPolarization.jpg" width="100%" style="display: block; margin: auto;" />
.small[What is Synthetic Aperture Radar? Source: [NASA Earth Data](https://earthdata.nasa.gov/learn/backgrounders/what-is-sar)]

---
# SAR background

Scattering can also change with wavelength

With further penetration (longer wavelengths) the volume scattering will change

<img src="img/SARtree_figure2.jpg" width="100%" style="display: block; margin: auto;" />
.small[What is Synthetic Aperture Radar? Source: [NASA Earth Data](https://earthdata.nasa.gov/learn/backgrounders/what-is-sar)]

---
# SAR background

.pull-left[
* The wavelength of a SAR sensor changes its applications

<img src="img/SAR_bands_NASA.png" width="100%" style="display: block; margin: auto;" />
]

.pull-right[
* Remember these bands sit on the electromagnetic (EMR) spectrum

<img src="img/SAR_bands_EMR.jpg" width="100%" style="display: block; margin: auto;" />
]

.small[What is Synthetic Aperture Radar? Source: [NASA Earth Data](https://earthdata.nasa.gov/learn/backgrounders/what-is-sar)]

---
# Amplitude (backscatter) and phase

.pull-left[
A SAR signal has both **amplitude** (backscatter) and **phase** data

**Backscatter (amplitude)**
* Polarization
  * VV = surface roughness
  * VH = volume of the surface (e.g. vegetation has a complex volume and can change the polarization)
* Permittivity (dielectric constant) - how **reflective** the material is, i.e. how much energy is **reflected back to the sensor**.
Water usually reflects the signal away elsewhere
* The return **value** - also remember the band (wavelength)
]

.pull-right[
<img src="img/amplitude_example.png" width="100%" style="display: block; margin: auto;" />

* Wind makes the water surface move (roughens it), so more is reflected back to the sensor (under VV)

.small[NASA Data Made Easy: Part 2 - Introduction to SAR. Source: [NASA Earth Data](https://www.youtube.com/watch?v=Zfn7P395O40)]
]

---
# Amplitude (backscatter) and phase

.pull-left[
A SAR signal has both **amplitude** (backscatter) and **phase** data

**Phase**
* The location of the wave in its cycle when it comes back to the sensor
]

.pull-right[
<img src="img/phas_shift.png" width="100%" style="display: block; margin: auto;" />
.small[InSAR. Source: [Pascal Castellazzi](https://www.researchgate.net/figure/Principle-of-the-InSAR-techniques-the-phase-difference-observed-by-comparing-two-SAR_fig1_342916517)]
]

---
# InSAR

InSAR maps ground movement detected through this **phase shift**

Movement is shown through an **interferogram**

<img src="img/phase_shift.jpg" width="70%" style="display: block; margin: auto;" />
.small[InSAR. Source: [GeoScience Australia](https://www.ga.gov.au/scientific-topics/positioning-navigation/geodesy/geodetic-techniques/interferometric-synthetic-aperture-radar)]

---
# InSAR

The field of this work is called **interferometry**

> Interferometric synthetic aperture radar (InSAR) techniques combine two or more SAR images over the same region to reveal surface topography or surface motion

<img src="img/Interferogram.jpeg" width="60%" style="display: block; margin: auto;" />
.small[An example of an interferogram from the airborne UAVSAR instrument obtained over the San Andreas Fault in California. The fault line can be identified in the upper half where the pink and yellow colors meet. This color change in the “fringe” is caused by surface movement that occurred between the observation dates of the two polarimetric images combined to produce the interferogram. Credit: NASA/JPL-Caltech]
.small[Source: [NASA](https://nisar.jpl.nasa.gov/mission/get-to-know-sar/overview/#:~:text=Interpreting%20Radar%20Images&text=Regions%20of%20calm%20water%20and,scattered%20back%20to%20the%20antenna.)]

---
# Differential Interferometric Synthetic Aperture Radar (DInSAR)

Some of the phase shift might come from topography (hills etc.). If that is the case (it might not be), we can remove the effect - this is termed **Differential Interferometry (DInSAR)**:

**remove the effect of natural elevation**

Conceptually similar to DSMs, DEMs and DTMs...

<img src="img/The_difference_between_Digital_Surface_Model_(DSM)_and_Digital_Terrain_Models_(DTM)_when_talking_about_Digital_Elevation_models_(DEM).svg.png" width="70%" style="display: block; margin: auto;" />
.small[DSM and DEM. Source: [Wikimedia Commons](https://commons.wikimedia.org/wiki/File:The_difference_between_Digital_Surface_Model_%28DSM%29_and_Digital_Terrain_Models_%28DTM%29_when_talking_about_Digital_Elevation_models_%28DEM%29.svg)]

---
# InSAR or DInSAR

* SAR = active sensor, sees through clouds, records the energy reflected back
* InSAR = used for DEMs, converting the phase difference to relative height
* DInSAR = changes between two images in time. Looking at movement of the land (uplift or sinking) with topography removed (using a DEM)

---
# SAR data processing

.pull-left[
Each pixel is stored as a complex (real/imaginary) number known as I and Q, meaning an in-phase (I) and quadrature (Q) pair - terminology borrowed from **electrical engineering**

To create a visible image (amplitude) the signal is **detected** = the Ground Range Detected (GRD) product in GEE:
* the square root of the sum of the squared I and Q values in the Single Look Complex (SLC) product
* This gives intensity (backscatter)

Values can be stored in power, amplitude or dB....
]

.pull-right[
<img src="img/product_tree.png" width="100%" style="display: block; margin: auto;" />
.small[The Tree of Processing Options. Source: [ICEYE](https://iceye-ltd.github.io/product-documentation/5.0/productFormats/introduction/)]
]
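---
# From I and Q to amplitude: a toy example

A minimal worked example of the detection step above (the numbers are purely illustrative, not real SLC values):

* `\(A = \sqrt{I^2 + Q^2}\)` (amplitude), `\(P = I^2 + Q^2\)` (power), `\(\phi = \arctan(Q/I)\)` (phase)
* If a pixel has `\(I = 3\)` and `\(Q = 4\)`, then `\(A = \sqrt{9 + 16} = 5\)`, `\(P = 25\)` and `\(\phi \approx 53^\circ\)`
* Detected (GRD) products keep only the amplitude/intensity - the phase needed for InSAR is discarded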
---
class: inverse, center, middle

## In GEE

## Only amplitude (backscatter) data is available..

## To use phase data we need to use SNAP (not considered here)

---
# GEE SAR data

* Notice that it is logged - why?
  * To show the full range of values
* To get non-logged data - why?
  * To undertake calculations

```js
ee.ImageCollection('COPERNICUS/S1_GRD_FLOAT')
```

<img src="img/SAR_GEE.png" width="70%" style="display: block; margin: auto;" />

---
# SAR data values

Sentinel-1 Radiometric Terrain Corrected (RTC) data is typically provided on a **power** scale...

**power scale** (RAW data)
* Values are low and close to 0
* This means bright areas are skewed (you can't see any difference)
* good for statistics / analysis, poor for visualisation

**amplitude scale**
* the square root of the power scale values
* brightens dark pixels and darkens bright pixels - narrows the range
* good for visualisation + log changes (see later slides)

**dB scale** (what GEE serves by default)
* 10 times the log10 of the power scale values
* good for identifying differences in dark pixels (e.g. water)
* not great for visualisation - it can look "washed out"
* not suitable for statistical analysis (it is a log scale)

.small[Introduction to SAR. Source: [Alaska Satellite Facility](https://hyp3-docs.asf.alaska.edu/guides/introduction_to_sar/#introduction-to-sar)]

---
class: inverse, center, middle

## SAR data values

### If a paper says they used SAR data - which scale?

---
# Notes / examples

### Notes
* Somewhat similar to the theory of atmospheric scattering
* The ground under the sensor is not homogeneous
* SAR returns may interfere with each other = a GRAINY looking image
* This "salt and pepper" effect is termed speckle and can be smoothed out
* This is done by averaging multiple images.

### Examples
* See [the scale conversion tool by Heidi Kristenson, ASF](https://storymaps.arcgis.com/stories/73b6af082e1f44bca8a0c5fb6bf09f37)
* The [power of SAR seeing through clouds](https://storymaps.arcgis.com/stories/2ead3222d2294d1fae1d11d3f98d7c35)

---
# GEE SAR data...a closer look

<img src="img/SAR_GEE.png" width="100%" style="display: block; margin: auto;" />

---
# Questions to help us decide

We need to consider...

**What are we trying to detect - the roughness or the volume of a material?**
* Polarization
  * VV = surface roughness
  * VH = volume of the surface (e.g. vegetation has a complex volume and can change the polarization)

**What are we trying to do...**
* Type of data
  * power scale = analysis
  * amplitude scale = visualisation
  * dB scale = dark pixel differences

**What part of the Earth are we looking at...**
* lots of water?
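---
# GEE SAR data...a starting sketch

Pulling those choices together, a minimal GEE (Code Editor JavaScript) sketch - illustrative only, with a hypothetical area of interest `aoi` that you would replace with your own geometry:

```js
// Hypothetical AOI - replace with your own geometry
var aoi = ee.Geometry.Point([-0.13, 51.52]).buffer(10000);

// Sentinel-1 GRD, served in dB by default
var s1 = ee.ImageCollection('COPERNICUS/S1_GRD')
  .filterBounds(aoi)
  .filterDate('2022-01-01', '2022-04-01')
  .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
  .filter(ee.Filter.eq('instrumentMode', 'IW'))
  .select('VV');

// dB -> power for statistics: power = 10^(dB / 10)
var toPower = function(image) {
  return ee.Image(10).pow(image.divide(10))
    .copyProperties(image, image.propertyNames());
};
var s1Power = s1.map(toPower);
```

Alternatively, load `COPERNICUS/S1_GRD_FLOAT` to get the unlogged values directly.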
---
class: inverse, center, middle

## How do we identify change then?

--

## At the moment there seems to be no consistent approach to this question...

---
# Identifying change...

Now that we know enough about SAR data...how do we establish changes over time?

**Subtract images**
* With optical data we can sometimes subtract images to determine differences
* This is not a good idea with SAR data

> So difference pixels in bright areas will have a higher variance than difference pixels in darker areas [(Mort Canty)](https://developers.google.com/earth-engine/tutorials/community/detecting-changes-in-sentinel-1-imagery-pt-2)

> image differencing technique is not adapted to the statistics of SAR images and non robust to calibration errors [(Vaiyammal, 2018)](https://www.ijert.org/change-detection-on-sar-images)

---
# Identifying change...

**(Original) ratio images**
* This is just one image divided by the other....
* `\(ratio = \frac{image_2}{image_1}\)`

**Improved ratio (IR)**
* Designed to give changed pixels more difference
* `\(IR = 1 - \frac{\min(I_1(x), I_2(x))}{\max(I_1(x), I_2(x))}\)`
* where `\(\min(I_1(x), I_2(x))\)` is the minimum of the two images' values at each pixel `\(x\)`

.small[It is a misunderstanding that log ratio outperforms ratio in change detection of SAR images. Source: [Zhuang et al. 2019](https://www.tandfonline.com/doi/pdf/10.1080/22797254.2019.1653226?needAccess=true)]

---
# Identifying change...

**Mean ratio images**
* `\(X_m(i,j) = 1 - \min\left(\frac{u_1(i,j)}{u_2(i,j)}, \frac{u_2(i,j)}{u_1(i,j)}\right)\)`
* where `\(u_1(i,j)\)` and `\(u_2(i,j)\)` are the mean values of a neighbourhood around pixel `\((i,j)\)` in images 1 and 2

**Log ratio images**
* `\(log\,ratio = \ln\frac{image_2}{image_1}\)`

**Improved log ratio (IRL) images**
* `\(IRL = \ln\left(1 - \frac{\min(I_1(x), I_2(x))}{\max(I_1(x), I_2(x))}\right)\)`

---
class: inverse, center, middle

## Which is best?

--

## Well, hard to say...

--

## How do we know / how do we test these?

---
# Testing

.pull-left[
* In their paper Zhuang et al. (2019) use three existing SAR datasets
  * Berne, Ottawa and FengFeng
* Each of the datasets has a reference map
  * Some have been manually created; for others the method isn't specified
* They use an **ROC curve** (better when the curve is closer to the upper left)
]

.pull-right[
<img src="img/zhuang_paper.png" width="100%" style="display: block; margin: auto;" />
.small[It is a misunderstanding that log ratio outperforms ratio in change detection of SAR images. Source: [Zhuang et al. 2019](https://www.tandfonline.com/doi/pdf/10.1080/22797254.2019.1653226?needAccess=true)]
]

---
# ROC reminder (Receiver Operating Characteristic Curve)

.pull-left[
* Change the threshold value for what is labelled as change or no change...
* Compare this to the manually provided testing data
* Compute the True Positive rate (y axis) and False Positive rate (x axis)
  * True positive = predicted change and is change
  * False positive = predicted change but is not change
* the rates are...

`\(TPR = \frac{\text{true positives}}{\text{true positives} + \text{false negatives}}\)`

`\(FPR = \frac{\text{false positives}}{\text{false positives} + \text{true negatives}}\)`
]

.pull-right[
<img src="img/matrix.PNG" width="100%" style="display: block; margin: auto;" />
.small[Source: [Barsi et al. 2018 Accuracy Dimensions in Remote Sensing](https://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XLII-3/61/2018/isprs-archives-XLII-3-61-2018.pdf)]
]
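---
# Identifying change...a two-date sketch in GEE

A hedged sketch of the log ratio approach from earlier in this section (GEE JavaScript). It assumes `before` and `after` are single VV images on the **power** scale, which you would build from the collection yourself:

```js
// Log ratio of the two dates: ln(after / before)
var logRatio = after.divide(before).log().rename('log_ratio');

// Flag change where the absolute log ratio is large.
// 1.5 is purely illustrative - choose the threshold with
// an ROC curve against validation data.
var change = logRatio.abs().gt(1.5).selfMask();

Map.addLayer(logRatio, {min: -2, max: 2}, 'log ratio');
Map.addLayer(change, {palette: ['red']}, 'change');
```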
---
class: inverse, center, middle

## What's the problem with these approaches?

## What aren't we making use of?

--

## We aren't using the image collection data

## Just two images in time

---
# Identifying change...through image collections

There are a few ways to do this....

1. Using common statistics and tests we have already seen
2. Using published methodologies specific to SAR, such as that by Canty et al. (2020), which appears in the GEE community tutorials
3. Fusing the imagery with optical data and then classifying

---
# Identifying change...statistical tests

In practical 1 we briefly considered a t-test for comparing the difference between Sentinel and Landsat data. Here we can apply similar logic to test for changes between image **collections**....

`\(t = \frac{\overline{x}_1 - \overline{x}_2}{\sqrt{s^2\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}}\)`

Where:
* `\(\overline{x}_1\)` = mean of the pre-event image collection
* `\(\overline{x}_2\)` = mean of the post-event image collection
* `\(n_1\)` = observations in group 1 (and `\(n_2\)` in group 2)
* `\(s^2\)` = pooled variance (another formula)

---
# Identifying change...statistical tests

.pull-left[
However, a **possible** problem here is that the t-test assumes a normal distribution

* In the SAR tutorials on [GEE by Mort Canty](https://developers.google.com/earth-engine/tutorials/community/detecting-changes-in-sentinel-1-imagery-pt-1) it is suggested that SAR data follow a gamma distribution. This means that they are skewed...**however**...
* Andy Field states that the normality assumption refers to the difference image, not the original (before and after) images. In the tutorial by Canty this is applied to the ratio image...
* Nevertheless, Canty goes on to demonstrate a continuous change detection algorithm for SAR...which can detect the dates of changes in the values.
]

.pull-right[
<img src="img/mort_canty_gamma.png" width="100%" style="display: block; margin: auto;" />
.small[Histogram of a single SAR image. Detecting Changes in Sentinel-1 Imagery (Part 1). Source: [Canty 2022](https://developers.google.com/earth-engine/tutorials/community/detecting-changes-in-sentinel-1-imagery-pt-1)]
]

---
class: inverse, center, middle

## But do we need that level of complexity?

--

### Consider an event and a complete image collection (just all the images for the year of the event)

### If change has occurred in a pixel we'd expect a much higher standard deviation (over time)

### If no change has occurred, a lower standard deviation

### We can pick (threshold) the pixels that have changed....

---
# Recall thresholding and ROC?

Receiver Operating Characteristic Curve (the ROC Curve)

.pull-left[
* Originates from WW2: the USA wanted to minimise noise in radar returns, identifying aircraft (true positives) without missing any...while minimising false positives (clouds)
* **Changing the threshold value of the classifier** will change the True Positive rate
  * the probability that a positive sample is correctly predicted in the positive class...planes predicted to be planes
]

.pull-right[
* False positive rate: the probability that a negative sample is incorrectly predicted in the positive class...predicted planes...that are actually clouds
* Maximise true positives (1) and minimise false positives (0)

<img src="img/ROC_planes.png" width="100%" style="display: block; margin: auto;" />
.small[Source: [MLU-EXPLAIN](https://mlu-explain.github.io/roc-auc/)]
]

---
# Recall thresholding and ROC?

## Note
* The previous slide was from week 7 - check back there for more information.
* Here we would need some validation data of change
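---
# Thresholding temporal variability in GEE

A hedged sketch of the standard deviation idea above (GEE JavaScript). It assumes `s1Power` is a filtered Sentinel-1 VV collection on the power scale, as in the earlier sketch:

```js
// Per-pixel standard deviation through time - high values
// suggest the pixel changed during the period
var sd = s1Power.reduce(ee.Reducer.stdDev());

// 0.05 is purely illustrative - the threshold should be
// chosen with an ROC curve against validation data
var changed = sd.gt(0.05).selfMask();

Map.addLayer(sd, {min: 0, max: 0.2}, 'temporal stdDev');
Map.addLayer(changed, {palette: ['red']}, 'candidate change');
```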
---
# Identifying change...image fusion

.pull-left[
Remember from the [video in week 3](https://youtu.be/a4dW5EWbNK4?t=161)

* Decision level fusion
  * Use radar and optical as separate layers (just appending a band to the stack)
  * Then classify
* Object level fusion
  * Need to make objects from the imagery
  * Combine optical and SAR (as in decision level)
  * Classify
* Image fusion
  * Pixel values are combined from both optical and SAR
  * **New** pixel values
]

.pull-right[
<img src="img/pixel_object_fusion.jpg" width="100%" style="display: block; margin: auto;" />
.small[Overview of the advantages and drawbacks of the most common multispectral-radar SRS image fusion techniques, as well as examples for open-source software to implement them. Source: [Henrike Schulte to Bühne and Nathalie Pettorelli, 2017](https://besjournals.onlinelibrary.wiley.com/doi/10.1111/2041-210X.12942)]
]

---
# Identifying change...image fusion

.pull-left[
* From the figure, we have already seen in past sessions:
  * Principal component analysis
  * Object based image analysis
  * High pass filtering - the same concept, with different data here
  * Segmentation
* We haven't covered:
  * Intensity hue saturation transformation
  * Wavelet transformation
]

.pull-right[
<img src="img/pixel_object_fusion.jpg" width="100%" style="display: block; margin: auto;" />
.small[Overview of the advantages and drawbacks of the most common multispectral-radar SRS image fusion techniques, as well as examples for open-source software to implement them. Source: [Henrike Schulte to Bühne and Nathalie Pettorelli, 2017](https://besjournals.onlinelibrary.wiley.com/doi/10.1111/2041-210X.12942)]
]

---
# Identifying change...image fusion

.pull-left[
* An example of image fusion from the Alaska Satellite Facility...
  * Take the backscatter from SAR
  * Take an optical image of the same area
  * Convert the optical image from Red, Green, Blue (or another mix of bands) to Intensity, Hue and Saturation
  * Replace the intensity with the SAR data
  * Convert back to RGB

Jensen covers this on page 171
]

.pull-right[
<img src="img/fused_image_ASF.jpg" width="100%" style="display: block; margin: auto;" />
.small[Figure 11: Image fusion result of SAR and optical imagery. Source: [ASF](https://hyp3-docs.asf.alaska.edu/guides/rtc_product_guide/#change-detection-using-rtc-data)]
.small[[ASF has a useful SAR guide](https://hyp3-docs.asf.alaska.edu/guides/rtc_product_guide/#change-detection-using-rtc-data)]
]

---
# Identifying change...image fusion

A few things to note...

* IHS and HSV are similar but not the same
* GEE only has the spectral transformation rgbToHsv, which is different to IHS. See: [Kamble et al. (2016)](https://www.semanticscholar.org/paper/HSV%2C-IHS-and-PCA-Based-Image-Fusion%3A-A-Review-Kamble-Maisheri/3d36b4232e2b9b8806ff92abd18a6f5ed46918d2). You may need to Google the paper...
* But GEE also has the reverse, hsvToRgb...
* There are online converters for both [RGB to IHS and IHS to RGB](https://www.had2know.org/technology/hsv-rgb-conversion-formula-calculator.html)
* You could take these formulas and implement them in GEE
* Or substitute the value band in the GEE function with the SAR data....as sketched on the next slide
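---
# Image fusion...a GEE sketch

A hedged sketch of intensity substitution using GEE's HSV functions (so not true IHS - see the previous slide). It assumes `optical` is a 3-band RGB image scaled 0-1 and `sarVV` is a co-registered SAR backscatter band also rescaled to 0-1:

```js
// RGB -> HSV; bands come back as hue, saturation, value
var hsv = optical.rgbToHsv();

// Swap the value (brightness) band for the SAR backscatter,
// then convert back to RGB for display
var fused = ee.Image.cat([
  hsv.select('hue'),
  hsv.select('saturation'),
  sarVV.rename('value')
]).hsvToRgb();

Map.addLayer(fused, {min: 0, max: 1}, 'SAR-optical fusion');
```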
.pull-left[ "Three SAR images acquired by the ERS satellite during 5 May, 9 June and 14 July in 1996 are assigned to the red, green and blue channels respectively for display. The colourful areas are the rice growing areas, where the landcovers change rapidly during the rice season. The greyish linear features are the more permanent trees lining the canals. The grey patch near the bottom of the image is wetland forest. The two towns appear as bright white spots in this image. An area of depression flooded with water during this season is visible as a dark region." ] .pull-right[ <img src="img/multitemporal colour composite SAR image.png" width="100%" style="display: block; margin: auto;" /> ] .small[SAR Images. Source:[CRISP](https://crisp.nus.edu.sg/~research/tutorial/sar_int.htm) ] --- # Me on SAR > Although SAR images over urban areas provide low quality images due to problems associated with radar imaging in such an environment (i.e. multiple bouncing, layover and shadowing), SAR texture measures can provide valuable information in discerning urban areas (Dell’Acqua et al., 2003; Zhu et al., 2012). Isolated scattering of residential areas and crowded backscatters of inner city high density areas permit classification refinement, thus textural measures such as those descried within the spatial domain can aid identification of alternative urban forms (Zhu et al., 2012). >However, the lack of freely available SAR data that temporally coincides with other satellite imagery (e.g. Landsat) frequently precludes extensive use --- # Summary 1 * SAR is an active sensor * Records the **amplitude** (backscatter) and **phase** data (location of the wave cycle) * Consider: * polarization (e.g. VV = surface roughness) * Permativity (dielectric constant) - how **reflective** is the property which means **reflective back to the sensor**. Water usually reflects it off elsewhere * Wavelength (or band) * In GEE we **just have amplitude (backscatter)** not phase * Data comes in three units * power scale (RAW SAR) = statistics * amplitude = visualisation * dB scale = seeing differences (default in GEE) --- # Summary 2 * SAR is very useful as it can **see through clouds** unlike optical sensors (e.g. Landsat) * But how can it be useful in our analysis? * Change between two images (e.g. ratio or log ratios) But this doesn't use the high temporal nature of it! We can see the variance over time through: * t-tests * standard deviation Or we can fuse our SAR data to optical data * Principal component analysis * Object based image analysis * Intensity fusion