Hyperspectral imaging helps advance agricultural studies.

Figure 1: For airborne precision agriculture applications, the spectral range most desired is the VNIR, which spans 400 to 1,000 nm. Image: Headwall Photonics Inc.

The push to make food and poultry products safer, more wholesome and more plentiful is leading to new initiatives commonly described as “crop science” and “precision agriculture.” Although there are many facets to these initiatives, the ability to “see” the desired field of view with a high degree of spectral and spatial resolution can lead to many scientific breakthroughs that benefit the global community.

In some cases, the desired field of view can be an entire crop field or vineyard as seen from an aircraft or unmanned aerial system (UAS). In other cases, it can simply be crops or poultry moving rapidly along a conveyor line mere feet away. Common to both is motion, which allows a sensor (either multispectral or hyperspectral) to capture frames of high-quality spectral image data that can be analyzed later. A complete representation is called a hyperspectral data cube, which is a stack of images of the same object or scene—essentially one image for each wavelength.
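To make the data-cube idea concrete, here is a minimal sketch in Python/NumPy. The dimensions are hypothetical and chosen only for illustration; a real cube's size depends on the sensor and flight parameters.

```python
import numpy as np

# Hypothetical cube: two spatial axes (lines, samples) and one spectral axis (bands).
lines, samples, bands = 100, 640, 270
cube = np.zeros((lines, samples, bands))

# The "image for each wavelength": one full spatial frame at a single band.
band_image = cube[:, :, 42]        # shape (100, 640)

# The full spectrum recorded for a single pixel.
pixel_spectrum = cube[50, 320, :]  # shape (270,)
```

Slicing along the spectral axis yields a conventional grayscale image; slicing at a pixel yields that pixel's spectral signature.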

The practical difference between multispectral and hyperspectral imaging lies primarily in the number of spectral bands captured. Multispectral imaging captures between five and 30 bands, with gaps between those bands. Hyperspectral captures hundreds, with very dense and continuous spectral information for each pixel in the image. In some cases, one will be preferable to the other. However, a hyperspectral sensor gives the option of “seeing” everything. Spectral signatures are powerful discriminators, and it’s useful to know whether a particular chemical fingerprint is present or not. There are many instances where the granularity and specificity of hyperspectral image data are absolutely necessary.

Hyperspectral imaging has the unique ability to extract meaningful scientific information from the scene or field of view. It allows users to detect the presence of a material based on its spectral fingerprint. It also allows users to classify and separate materials into spectrally similar groups. Discrimination, characterization and quantification are also hallmarks of the technology.
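One standard way to detect a material from its spectral fingerprint is the spectral angle mapper (SAM), which compares each pixel's spectrum to a reference signature by the angle between them; the article does not name a specific algorithm, so this is an illustrative sketch with toy spectra rather than the vendor's method.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference signature.
    A smaller angle means a closer spectral match; because the angle ignores
    vector magnitude, the measure is insensitive to overall illumination."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Toy four-band spectra (hypothetical values, for illustration only).
reference   = np.array([0.1, 0.4, 0.8, 0.3])
bright_copy = 2.5 * reference                  # same material, brighter scene
other       = np.array([0.8, 0.4, 0.1, 0.3])  # different material
```

Thresholding the angle per pixel classifies the scene into spectrally similar groups, which is one way the detection, classification and discrimination described above can be carried out.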

The ability of hyperspectral sensors to exhibit a high degree of discrimination means scientists can classify crop disease conditions, and also build an image that faithfully pinpoints where a disease is and how invasive it might be. Since the image data is GPS-coordinated and orthorectified during post-processing, the scientific value is significant: irrigation and fertilizing decisions are more precise, speciation is more accurate and crops that might be lost are saved. Indeed, the very same hyperspectral imaging technology that can make existing crop harvests more bountiful can also help survey the land in famine-affected areas so crops can be planted with a higher degree of success.

Since motion is needed for hyperspectral imaging to create a data cube, it meshes perfectly with the rapid growth of the UAS across industry, research and academia. The UAS is more affordable, smaller and lighter than fixed-wing manned aircraft and, thus, is more readily deployable in unforgiving areas of the world. Armed with precise instrumentation, such as hyperspectral sensors, a UAS can deliver truly life-enriching information beneficial on a global scale. Entire economies depend on the success of agriculture: citrus in Florida; coffee in South America; vineyards in northern California. Across them all, UAS with hyperspectral sensors are deployed at a rapid pace.

One common mistake prospective users make is misjudging the work needed to integrate the entire flight/sensor package. Obviously, size and weight matter because a UAS, especially a small hand-launched one, will typically have a strict payload limit. Hyperspectral sensor manufacturers, such as Headwall Photonics, recognize this and are making their instruments smaller, lighter and more integrated. For example, Nano-Hyperspec (Figure 1) is only 3 in x 3 in x 4.7 in and weighs less than 1.5 lbs. But in that small space sits the data storage chip, and the GPS attaches directly to the sensor rather than connecting by cables that take up space and add weight. In addition to smaller and more integrated sensors, Headwall also helps to fully integrate the flight package. This means not only the sensor, but also help with the UAS, the GPS and, if desired, LiDAR instrumentation. LiDAR is a common add-on for precision agriculture work and other remote-sensing research.

Figure 2: Diffraction grating. Image: Headwall Photonics Inc.

The value of this integration work is a quicker time to deployment and more success with capturing precise image data. This work also turns a basic UAV or “drone” into a useful UAS.

For airborne precision agriculture applications, the spectral range most desired is the visible/near-infrared (VNIR), which spans 400 to 1,000 nm. Almost everything worth seeing can be seen in the VNIR range, but occasionally the shortwave infrared (SWIR), from 900 to 2,500 nm, provides useful data. One option is to simply have two sensors capturing the respective VNIR and SWIR image data, both controlled simultaneously by hyperspectral imaging software. Another is an integrated dual-sensor package that co-registers the pixels from each sensor, producing data from 400 nm up to 2,500 nm. While heavier and larger than the Nano, it’s perfectly suitable for bigger UAVs and fixed-wing manned aircraft.
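Once the two sensors' pixels are co-registered, their cubes align spatially and can be joined along the spectral axis into one 400 to 2,500 nm data set. A minimal sketch, assuming hypothetical band counts and random data as stand-ins for real frames:

```python
import numpy as np

# Hypothetical co-registered cubes: same spatial grid, different spectral axes.
vnir = np.random.rand(100, 640, 270)  # e.g. 270 bands across 400-1,000 nm
swir = np.random.rand(100, 640, 170)  # e.g. 170 bands across 900-2,500 nm

# After co-registration, joining along the spectral axis (axis=2) yields
# a single cube covering the combined 400-2,500 nm range.
full = np.concatenate([vnir, swir], axis=2)
```

In practice the overlap region (900 to 1,000 nm) would also need to be reconciled between the two sensors; that step is omitted here.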

The basic design of a hyperspectral sensor is shown in Figure 2. Reflected light passes through a lens, then through an image slit, which allows a sliver of the scene image to reach a curved concentric mirror. A high-performance diffraction grating then diffracts the reflected light precisely, and without any unwanted artifacts or aberrations, onto a second concentric mirror, which images that sliver of the scene onto a camera focal plane array (FPA), with the spectral information spread in the direction perpendicular to the slit. By recording the spectral data for one sliver of the scene, then moving either the sensor or the scene and repeating the process for each adjacent sliver, the hyperspectral image of the full scene can be captured. The spectral range of the sensor depends on the FPA technology: silicon (300 to 1,100 nm), indium gallium arsenide (InGaAs) (700 to 1,700 nm), indium antimonide (InSb) (1,000 to 5,000 nm) and mercury cadmium telluride (MCT) (1,000 to 14,000 nm). Hyperspectral sensors of this type are often referred to as line scanners, and the design described above is a concentric one. Because there are no moving parts within the sensor, the resulting instruments are robust and stable across temperature.
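The line-scanner acquisition described above can be sketched as follows: each captured frame holds one spatial line by all spectral bands, and platform motion supplies the second spatial dimension. The frame grabber and dimensions here are hypothetical placeholders, not a real sensor API.

```python
import numpy as np

samples, bands, n_lines = 640, 270, 100  # slit width, spectral bands, along-track steps

def read_frame():
    """Stand-in for grabbing one slit frame from the FPA: one spatial line
    (samples) with the spectrum dispersed perpendicular to the slit (bands)."""
    return np.random.rand(samples, bands)

# One frame per along-track step as the platform (or scene) moves;
# stacking the frames builds the full (lines, samples, bands) data cube.
frames = [read_frame() for _ in range(n_lines)]
cube = np.stack(frames, axis=0)
```

This is why motion is essential to this sensor design: without it, only a single sliver of the scene is ever imaged.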

Figure 3: Aberration-correction is a very desirable feature made possible through precisely engineered diffraction gratings. Image: Headwall Photonics Inc.

Aberration-correction is a very desirable feature made possible through precisely engineered diffraction gratings. The job of the grating is to diffract the light precisely, and this is accomplished by the design of the grating and its groove profile. Gratings can be planar, convex or concave, and the groove profiles are precisely engineered so each grating is application-specific. They can also be engineered in very small sizes, meaning the instruments that contain them are similarly small and light. Figure 3 depicts the basics of a diffraction grating.

The natural tendency is for image degradation to occur toward the edges of the field of view. These aberrations are corrected by the diffraction grating, meaning the entire wide field of view is crisp from one edge to the other. For UAS deployment, this is crucial because it helps optimize flight efficiency: fewer passes over the field are needed to faithfully and sharply capture all the image data the sensor sees.

Hyperspectral imaging isn’t limited to airborne deployment. The technology is also used along inspection lines for high-value specialty crops, such as nuts and fruits, and for poultry and seafood. In fact, the USDA is working with Headwall Photonics to push the science of hyperspectral imaging forward with the goal of improving the inspection accuracy of poultry. The end result is a new technology that delivers higher-quality food products to consumers, while delivering demonstrated differentiation and maximized value to producers.

Thanks to their ability to “see” with a high degree of specificity and discrimination, hyperspectral sensors can spot features that other imaging and vision techniques might miss. The sensors can also “grade” harvested crops in a manner that allows the producer to maximize yields.
