California is one place where we can easily see what has changed over days, months, and years. The rapid expansion of agricultural areas, along with hubs of technology and housing, makes satellite imagery of this region particularly interesting. Researchers can see how California has changed over the years, down to minute details.
Images of California also show how far our ability to photograph the planet from space has come. The Landsat satellites are well known for advancing both how well and how often we can picture the Earth we live on. As with many other technologies, Landsat imagery has improved as we’ve invested more time, money, and study into satellites orbiting the Earth.
Silicon Valley, California, was named after the element silicon, the semiconductor material behind the valley’s many technological creations. Landsat 1 captured images of the area in October 1972 using the Multispectral Scanner System (MSS), when the region was only a budding version of today’s sprawling urban metropolis.
Since those first images in 1972, a series of Landsat satellites has been launched, each able to take increasingly detailed pictures of our world. The latest, Landsat 8, collected image data over the same area using the Operational Land Imager (OLI) in November 2016.
The two pictures are built on the same basic ideas, yet are very different from one another. Landsat 1’s data had to be rendered in false color, while Landsat 8 can produce natural-color images. Spatial resolution has been the greatest improvement in Landsat technology; the increase in detail visible in the 2016 image compared to the 1972 data is incredible. The MSS image had a spatial resolution of 68 by 83 meters per pixel, while OLI sharpens this to 30 meters, and to 15 meters in its panchromatic band.
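To put those numbers in perspective, here is a minimal back-of-the-envelope sketch (not from the source article) comparing the ground area covered by a single pixel from each sensor:

```python
# Approximate ground footprint of one pixel for each sensor.
mss_pixel_area = 68 * 83   # MSS: 68 m x 83 m per pixel (~5,644 m^2)
oli_pixel_area = 30 * 30   # OLI multispectral: 30 m x 30 m per pixel (900 m^2)

# Each MSS pixel covers the same ground as roughly six OLI pixels,
# so features that blurred into a single MSS pixel resolve into
# several distinct OLI pixels.
ratio = mss_pixel_area / oli_pixel_area
print(f"One MSS pixel spans about {ratio:.1f} OLI pixels")  # ~6.3
```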
Compare Imagery from Landsat 1 with Landsat 8
The paired images compare Landsat imagery of Silicon Valley from 1972 with imagery captured in 2016.
The OLI instrument is better able to detect differences in brightness and color on the Earth’s surface. While MSS recorded 64 possible data values per pixel (6 bits), OLI records 4,096 (12 bits). This not only captures more detail, but lets tones blend smoothly across the whole image rather than stepping abruptly between levels.
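To see why more data values per pixel matter, consider this minimal sketch (the brightness ramp is invented for illustration) that quantizes the same gradient at 6 bits, as MSS did, and at 12 bits, as OLI does:

```python
import numpy as np

# A smooth brightness ramp standing in for light reflected off the ground.
signal = np.linspace(0.0, 1.0, 1000)

def quantize(values, bits):
    """Round continuous values to the nearest of 2**bits discrete levels."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

mss_like = quantize(signal, 6)    # 64 levels, like MSS
oli_like = quantize(signal, 12)   # 4,096 levels, like OLI

# Coarse quantization produces visible "steps" (banding) across smooth
# gradients; finer quantization tracks the original signal far more closely.
print("6-bit max error: ", np.abs(signal - mss_like).max())   # ~0.0079
print("12-bit max error:", np.abs(signal - oli_like).max())   # ~0.00012
```

The 12-bit image can distinguish brightness differences 64 times finer than the 6-bit image, which is what makes its tones appear to blend smoothly.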
Images from MSS and OLI also look different because of the wavelengths they are designed to detect. MSS did not collect blue light, so its images are rendered in false color using near-infrared wavelengths, which is still helpful for detecting vegetation features in the landscape. The OLI image, by contrast, is a natural-color composite built from the red, green, and blue bands (OLI does detect near-infrared light, but that band is not used in this composite).
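The two kinds of composites can be sketched with NumPy. The band arrays below are random placeholders (in practice they would be read from Landsat data files, for example with a library like rasterio); the point is how the bands map to display channels:

```python
import numpy as np

# Hypothetical single-band images with values scaled to [0, 1].
h, w = 512, 512
green = np.random.rand(h, w)
red = np.random.rand(h, w)
nir = np.random.rand(h, w)   # near-infrared
blue = np.random.rand(h, w)  # OLI collects blue light; MSS did not

# MSS-style false color: NIR -> display red, red -> display green,
# green -> display blue. Vegetation, which reflects strongly in NIR,
# shows up bright red in such composites.
false_color = np.dstack([nir, red, green])

# OLI-style natural color: the red, green, and blue bands map directly
# to the display's red, green, and blue channels.
natural_color = np.dstack([red, green, blue])
```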
More: A Clearer View of Silicon Valley, NASA