For the past three years, DigitalGlobe’s Geospatial Big Data platform (GBDX) has been making it simple and fast (and fun!) for the geospatial industry to apply continent-scale machine learning and information extraction processing to the world’s best commercial high-resolution optical satellite imagery.

An ecosystem of innovators from across the globe is bringing its expertise in image processing to the satellite content in order to extract valuable data that is helping solve previously unimaginable challenges.

Despite all of the GBDX ecosystem’s achievements, we have not been able to see through clouds, day or night…until now!

Through our partnership with MDA, we’re pleased to announce that we have added RADARSAT-2 synthetic aperture radar (SAR) content and tools to GBDX. GBDX’s massive-scale content library and cloud-based processing now include SAR content. But why did we do this?

What is SAR?

Whereas traditional electro-optical (EO) imagery depends on the sun’s energy to illuminate objects for a camera’s lens or telescope to focus onto its digital detector, SAR sensors bring their own source of illumination: radar energy. Whether it’s a weather radar, an air traffic control radar, or a SAR sensor, radar energy is transmitted toward a target, and the reflections of that energy are detected and measured. Radar energy is unaffected by clouds or darkness; it transmits and reflects the same regardless of cloud cover or time of day. Also because of radar energy’s properties, some materials (like metallic structures) strongly reflect the signals directly back to the sensor, while others scatter the signals (like trees) or reflect them away (like water).

SAR sensors are specially designed to transmit millions of sequential radar pulses, capture the properties of the millions of reflections, and then (through lots of complicated math) assemble these pulse reflections into ‘images’. The net result is a synthetically generated ‘view’ of the earth with radar-reflective objects registering as bright and radar-absorptive or -scattering objects registering as dark.
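The bright/dark mapping can be sketched with a toy example. This is an illustration only, not MDA’s actual processing: raw backscatter amplitudes are converted to decibels and scaled to an 8-bit display range, so strong reflectors appear bright and weak returns appear dark. The dB floor and ceiling below are illustrative, not calibrated values.

```python
import math

def amplitude_to_display(amplitudes, floor_db=-30.0, ceil_db=10.0):
    """Map raw SAR amplitudes to 0-255 display values.

    Strong reflectors (large amplitude) -> bright pixels;
    weak returns (water, radar-absorptive surfaces) -> dark pixels.
    The floor_db/ceil_db clipping range is illustrative only.
    """
    out = []
    for a in amplitudes:
        db = 20.0 * math.log10(max(a, 1e-6))   # amplitude -> decibels
        clipped = min(max(db, floor_db), ceil_db)
        scaled = (clipped - floor_db) / (ceil_db - floor_db)  # 0..1
        out.append(round(scaled * 255))
    return out

# Toy amplitudes: a metal structure (3.0), tree canopy (0.3), open water (0.01)
pixels = amplitude_to_display([3.0, 0.3, 0.01])
```

The same monotonic mapping underlies real SAR ‘images’: the metal structure renders near-white, the canopy mid-gray, and the water black.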

So why does it matter that it’s on GBDX? As anyone who has ever looked at SAR data can attest, SAR sensor outputs are inherently less intuitive for people to understand than EO imagery, and SAR’s greatest power lies in many of those non-intuitive aspects of the data. In other words, you want machines trained to understand radar (and really good at math), not people, extracting information from SAR. You also want the same massive-scale access to SAR data and processing to solve your toughest location intelligence problems. That’s exactly what GBDX was made for.


SAR “imagery” (left) compared to same-scene EO imagery (right)

Now comes the really powerful part: massive-scale data access and processing for both EO and SAR in one place on GBDX. In each use case below, we anticipate 10-50X efficiency gains over traditional analysis techniques, without analyst fatigue.

Use Case #1: Explore country-scale forest canopy for illicit clearing.

Using GBDX, users define the country or region for monitoring. GBDX is then applied to discover illicit clearing through the following process:

  1. GBDX finds all SAR content from the past 12 months in the area of interest, already resident on GBDX, broken up into 125 km × 125 km RADARSAT-2 ‘scene’ pairs.
  2. Using MDA’s RADARSAT-2-optimized change detection algorithm, the scene pairs are processed in parallel, allowing all pairs to be processed within minutes.
  3. MDA’s change detection algorithms generate change polygons, which are immediately posted to GBDX’s Vector Services and then retrieved to define candidate areas of illicit activity for further investigation.
  4. Cueing from these candidate area polygons, GBDX automatically searches the DigitalGlobe archive for recent clear imagery within the polygons, retrieves 30-50 cm multi-spectral EO imagery, and triggers image pre-processing (including atmospheric compensation to minimize the effects of atmospheric moisture and aerosols) as well as land, water, and man-made feature extraction processing.
  5. Vector polygon results are once again posted to GBDX’s Vector Services for immediate retrieval through geospatial tools like ArcMap or DigitalGlobe’s AnswerFactory, or via API access to download GeoJSON results directly. ArcMap and AnswerFactory users also have the option to view the EO imagery from which the land features were extracted for immediate visual validation.
  6. Analysts review automatically-cued results to draw final conclusions before taking action on validated results.
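MDA’s change detection algorithm itself is proprietary, but a common textbook approach to amplitude change detection is the log-ratio operator, sketched below on a toy pair of ‘before’ and ‘after’ amplitude grids. The 3 dB threshold is an illustrative assumption, not a calibrated value.

```python
import math

def log_ratio_change_mask(before, after, threshold_db=3.0):
    """Flag pixels whose amplitude changed by more than threshold_db.

    |20*log10(after/before)| is a standard amplitude change
    detection (ACD) measure; the threshold here is illustrative.
    """
    mask = []
    for b_row, a_row in zip(before, after):
        mask.append([
            abs(20.0 * math.log10(max(a, 1e-6) / max(b, 1e-6))) > threshold_db
            for b, a in zip(b_row, a_row)
        ])
    return mask

# Toy 2x3 scene pair: a clearing darkens one pixel (1.0 -> 0.2);
# a small 5% fluctuation elsewhere stays below the threshold.
before = [[1.0, 1.0, 1.0], [1.0, 1.0, 1.0]]
after  = [[1.0, 0.2, 1.0], [1.0, 1.0, 1.05]]
mask = log_ratio_change_mask(before, after)
```

In the real pipeline, flagged pixel clusters would be vectorized into the change polygons posted to Vector Services.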

Amplitude Change Detection (ACD) extra-fine results between June 6, 2016 and June 30, 2016 over Murmansk: Olenya Bay. The top images are the two images from the RADARSAT-2 imagery stack used for ACD. The bottom image is the resulting change detection map.

EO visual verification

Use Case #2: Identify regional military structures and activity.

In GBDX, users define regions where potentially hostile military encampments may arise.

  1. GBDX discovers all SAR content from matching-geometry SAR collections already resident on GBDX.
  2. GBDX automatically invokes MDA’s change detection algorithm to parallel process all relevant SAR scene pairs.
  3. Additionally, GBDX invokes ‘bright spot’ detection algorithms to identify radar-reflecting man-made structures in previously ‘dark’ areas.
  4. Resulting polygons are once again stored in GBDX Vector Services and immediately used to cue DigitalGlobe EO image retrieval and processing. This time, GBDX invokes EO-based object detection leveraging a variety of GBDX ecosystem artificial intelligence and pattern-based algorithms to characterize the number of buildings and vehicles present in the area of potential military significance.
  5. Vector polygon results are retrieved by analysts in ArcMap, AnswerFactory, or other user interfaces, facilitating visual confirmation with the source EO imagery. Once again, analysts focus their efforts on automatically cued results to draw final conclusions before taking action on the validated results.
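The ‘bright spot’ cueing in step 3 can be sketched as a simple threshold pass that keeps only pixels that are bright now but were dark in a baseline scene. This is an illustration of the idea only; the production detectors are more sophisticated, and the threshold is an assumed value.

```python
def detect_bright_spots(image, threshold, baseline=None):
    """Return (row, col) coordinates of pixels above threshold that
    were NOT bright in the baseline scene -- i.e., new
    radar-reflective structures in previously 'dark' areas.
    The threshold is illustrative, not a calibrated value.
    """
    spots = []
    for r, row in enumerate(image):
        for c, val in enumerate(row):
            was_dark = baseline is None or baseline[r][c] <= threshold
            if val > threshold and was_dark:
                spots.append((r, c))
    return spots

# Toy amplitudes: a new structure appears at row 1, column 2.
baseline = [[0.1, 0.1, 0.1], [0.1, 0.1, 0.1]]
current  = [[0.1, 0.1, 0.1], [0.1, 0.1, 2.5]]
spots = detect_bright_spots(current, threshold=1.0, baseline=baseline)
```

Each detection would then become a polygon cueing EO retrieval and object-detection processing, as in step 4.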



SAR “bright-spot” detection algorithm outputs showing land structures (top), ships (middle) and aircraft (bottom).



SpaceKnow ship detection (top) and OpenSpaceNet aircraft detection (bottom), both providing visual verification with EO data.

Use Case #3: Humanitarian Assistance & Disaster Response

In GBDX, users define regions of a recent natural or man-made disaster.

  1. GBDX discovers all SAR content from matching-geometry SAR collections already resident on GBDX.
  2. GBDX launches SAR change detection processing, generating change polygons indicating potential areas of damage.
  3. Resulting polygons are stored and immediately used to cue DigitalGlobe EO image retrieval and processing for automated land use land cover extraction as well as crowdsourced damage assessment.
  4. DigitalGlobe provides a channel for the public’s outpouring of support for disaster victims by launching a Tomnod crowdsourcing campaign, during which thousands of amateur participants identify and validate damaged buildings, flooded areas, and landscape changes. This, combined with automated extraction algorithm results, rapidly provides an assessment of where help is needed most.
  5. Results are stored in Vector Services and made available for viewing and further analysis in ArcMap, AnswerFactory, or other user interfaces to draw final conclusions before taking action on the validated results.
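The vector results in step 5 are standard GeoJSON, so they can be consumed directly by scripts as well as GIS tools. Below is a minimal sketch of filtering a FeatureCollection of damage polygons; the `damage_class` property name is an assumption for illustration, not the documented GBDX Vector Services schema.

```python
import json

# A minimal GeoJSON FeatureCollection in the shape a vector service
# might return; "damage_class" is a hypothetical property name.
results = json.loads("""{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"damage_class": "flooded"},
     "geometry": {"type": "Polygon",
                  "coordinates": [[[0,0],[0,1],[1,1],[1,0],[0,0]]]}},
    {"type": "Feature",
     "properties": {"damage_class": "none"},
     "geometry": {"type": "Polygon",
                  "coordinates": [[[2,2],[2,3],[3,3],[3,2],[2,2]]]}}
  ]
}""")

def damaged_features(fc):
    """Keep only features flagged as damaged, ready for review in a
    GIS tool or for seeding a crowdsourcing campaign."""
    return [f for f in fc["features"]
            if f["properties"]["damage_class"] != "none"]

hits = damaged_features(results)
```

Because GeoJSON is an open format, the same filtered features load directly into ArcMap, QGIS, or a web map without conversion.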




Flooding highlighted by MDA SAR Change Detection algorithm outputs (top). Automated GBDX processing on EO and OpenStreetMap data provides further quantification for levels of damage within affected areas (bottom). Tomnod crowdsourcing activates the public crowd to identify individual areas of damage within 24 hours of campaign start.

Interested in learning more about automated change detection on GBDX? See: RADARSAT-2 Automated Change Detection On GBDX: How It Works

See possibilities for solving your toughest problems?

Contact us