Spatial resolution is the core way in which we benchmark satellite imagery. Spatial resolution defines how far you can “zoom in,” and how small an object you can see. Plenty of free Earth images can show you the arc of a coastline or the bounds of a city. However, many uses require a level of detail that allows you to understand demographic and economic trends to make smart decisions. If you’re looking for a platform that can visualize roads, count cars, or measure containers, you need a closer look.
Resolution is determined by a simple equation: the larger the optics and the lower the sensor’s altitude, the finer the resolution. However, finer resolution does not always equate to “better.” Different use cases require different data sets.
In order to get a sensor into a stable orbit, it has to be at a minimum height above the Earth. As a result, height is not something we can adjust by much in this equation. That leaves the focal length and the physical size of the optics as the variables we can adjust. We’re in a similar situation to sideline sports photographers who carry huge $20,000 lenses that can capture player details from hundreds of feet away. While the iPhone has become one of the most ubiquitous cameras in the world, it does not serve every purpose.
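The relationship above is often expressed as ground sample distance (GSD): the ground footprint of a single sensor pixel at nadir. A minimal sketch of that calculation follows; the pixel pitch, altitude, and focal length below are illustrative assumptions, not the published specs of any particular satellite:

```python
def ground_sample_distance(pixel_pitch_m, altitude_m, focal_length_m):
    """Approximate ground distance covered by one sensor pixel at nadir.

    Smaller detector pixels, lower orbits, and longer focal lengths
    all shrink the GSD -- i.e. they sharpen the resolution.
    """
    return pixel_pitch_m * altitude_m / focal_length_m

# Illustrative values only (assumed for this sketch): an 8-micron
# detector pixel, a ~617 km orbit, and a 13.3 m focal length.
gsd = ground_sample_distance(8e-6, 617e3, 13.3)
print(f"{gsd * 100:.0f} cm per pixel")  # roughly 37 cm
```

Because altitude is largely fixed by orbital mechanics, the denominator — focal length, and the large optics that support it — is where commercial imaging satellites spend their engineering budget.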
The National Imagery Interpretability Rating Scale (NIIRS) is used by imagery analysts to define the quality of imagery from different imaging systems. The scale ranges from >900 cm (NIIRS 1), where a large industrial facility could be identified, to <10 cm (NIIRS 9), where the level of discernible detail is incredibly high. That finest tier is only commercially available from aerial platforms, which have limited coverage areas and are not available in many countries.
NASA’s Landsat carries a fantastic sensor that has been imaging the planet for decades. At 15 meter (or 1,500 cm) resolution, it is best used for identifying landscape-scale change as opposed to identifying individual objects. Between Landsat 7 and Landsat 8, a new image of the entire planet is taken every eight days.
The idea with NIIRS is that it helps determine the level of resolution needed to support the task at hand. DigitalGlobe imagery is rated between 5 and 5.7 on the NIIRS scale, representing the highest spatial resolution available from commercial satellites today. At 50 cm you can identify different automobiles as sedans or station wagons. Does that matter? It depends on what your need is; it’s all about aligning the correct data set with your problem.
Most commercial sources of Earth imagery resolve between 2 and 5 meters per pixel. But many use cases require the user to be able to zoom in further. DigitalGlobe’s imagery resolves between 30 and 50 cm per pixel. Through our telephoto-style lens, there’s a lot more to see. You can distinguish between a glass roof and an empty lot, or identify types of ships by their munitions and antenna arrays. You can count the number of vehicles parked at a dock; you can even determine which of those vehicles are sedans, station wagons, and SUVs.
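One way to make these numbers concrete is to ask how many pixels land on an object at each resolution. A quick sketch, using a rough assumed sedan length of 4.5 m:

```python
def pixels_across(object_size_m, resolution_m_per_px):
    """How many pixels span an object at a given spatial resolution."""
    return object_size_m / resolution_m_per_px

# A ~4.5 m sedan (rough assumption) at three resolution classes:
for label, res in [("15 m (Landsat)", 15.0),
                   ("2.5 m class", 2.5),
                   ("30 cm class", 0.3)]:
    print(f"{label}: {pixels_across(4.5, res):.1f} pixels")
```

At 15 m per pixel the car is sub-pixel and simply vanishes into the scene; at 2.5 m it occupies barely two pixels; at 30 cm it spans roughly fifteen pixels, enough to start telling a sedan from a station wagon.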
This is a photo of the port of Barcelona, taken by NASA’s Landsat. The image was taken at 15 meter spatial resolution: 1 pixel equals 15 meters. In this image, there are a few significant features: the coastline is clearly visible, as are the marina, ports, major roads, and buildings.
When you look at this same location imaged at 2.5 meters per pixel, you can see the incredible increase in visual information. In this image, taken with a Planet Labs Dove, you can now identify individual buildings to quantify urban density. You can identify road networks and compare with previous images to see where roads have changed. But at DigitalGlobe, many of our use cases require us to zoom in a bit further.
Similar to the digital zoom on a camera, attempting to zoom in closer than the optics support does not resolve more information; it just makes your pixels bigger. Now, let’s look at the same location with an image taken at 30 cm spatial resolution, by the DigitalGlobe WorldView-3 sensor launched in 2014.
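The digital-zoom point can be demonstrated in a few lines: nearest-neighbour upsampling, the simplest form of “zooming past the optics,” only repeats the pixels you already have. A toy example with a made-up 2×2 image:

```python
import numpy as np

# A tiny 2x2 "image": four pixel intensities (values are made up).
coarse = np.array([[10, 200],
                   [50, 120]])

# "Digital zoom": nearest-neighbour upsampling by 4x repeats each
# pixel as a 4x4 block -- the image gets bigger, but no new
# information is created.
zoomed = np.kron(coarse, np.ones((4, 4), dtype=int))

print(zoomed.shape)       # (8, 8)
print(np.unique(zoomed))  # still only the original four values
```

However large the output grows, the set of distinct values never changes; only finer optics (or a lower orbit) can add real detail.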
The difference is clear. From this view, you can identify ship types. You can count a variety of objects: cranes, shipping containers, vehicles parked at the dock, parking spaces. Our notion of big data is transforming the pretty picture into the solutions you can extract from it.
More examples of resolution at work can be found in this blog post published by DigitalGlobe’s Dr. Walter Scott in February 2016.
Earth Imaging Basics Series
- Earth Imaging Basics: Imagery Acquisition
- Earth Imaging Basics: Spatial Resolution