
Do-it-yourself VIS/NIR pushbroom hyperspectral imager with C-mount optics

Open Access

Abstract

This paper describes a new optomechanical design based on a previously presented do-it-yourself pushbroom hyperspectral imager (HSI) using commercial off-the-shelf (COTS) components. The new design uses larger aperture C-mount optics at F/2.8 instead of S-mount optics at F/4 to increase the throughput, which allows imaging at lower light levels. This is especially useful for dark surfaces like the deep ocean. The throughput is improved by a factor of 6.77 at the center wavelength of 600 nm, which is shown both by theoretical calculations and experimental data. The measured full width at half maximum (FWHM) at 546.1 nm is 3.69 nm, which is close to the theoretical value of 3.3 nm, and smile and keystone are shown to be reduced in the new design. A method to characterize and remove second order effects using a cut-off filter is also presented and discussed.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Hyperspectral imaging, also known as imaging spectroscopy, can be used in multiple applications such as precision agriculture, ocean color, art conservation, mineral mapping, fish monitoring, and space research [1-8]. Different materials can be investigated due to their spectral properties, and this is often combined with remote sensing platforms [9]. There is now a rise in the use of miniaturized platforms such as unmanned aerial vehicles (UAVs) and small satellites, which increases the need for small and lightweight instruments.

Compact HSI designs have been developed for decades. Herrala et al. presented an HSI using a prism-grating-prism configuration for industrial purposes in 1994 [10]. In 2000, Sigernes et al. presented another small HSI with a grating-prism configuration [11], which was later used by Volent et al. for remote sensing purposes to map kelp from an aerial platform [12]. Saari et al. presented another miniaturized instrument in 2009 [13], based on the piezoactuated Fabry-Perot interferometer, which has further been used for agriculture applications [2,14]. A general challenge with these small imagers, however, is to achieve high quality and sensitivity without increasing the cost and complexity of the instruments.

High quality imaging and high sensitivity is especially important for Earth observation applications from space, since most of the signal received comes from light scattered by the atmosphere. Payloads such as Hyperspectral Imager for the Coastal Ocean (HICO), Hyperscout, Compact Smartspectral Imager for Monitoring Bio-agricultural Areas (CSIMBA), Compact High Resolution Imaging Spectrometer (CHRIS), Enhanced Light Offner Imaging Spectrometer (ELOIS), and Compact Hyperspectral Instrument for Monitoring the Atmosphere (CHIMA) have different types of small and miniaturized hyperspectral imagers for Earth observation [15-17]. These instruments typically have strict requirements, with design and development tailored to the mission, often resulting in complex designs. This, in turn, results in customized parts for the instruments, making them expensive and hard to replicate for use in other projects. There are also commercial instruments available from companies such as Specim, Norsk Elektro Optikk (NEO) and Headwall Photonics, but customization of proprietary instruments is not straightforward.

The goal of this paper is to present a simple design for an HSI with good image quality and relatively high sensitivity so that it can be used for ocean color remote sensing from both air- and spaceborne platforms. The instrument design builds on a previously presented small and lightweight pushbroom HSI, further referred to as the HSI V4, introduced by Sigernes et al. in 2018 [18]. The HSI V4 is already being used in several research projects, such as for smile and spectral tilt correction without an optical laboratory [19], as a drone payload for remote sensing [20], and as a part of a multiple platform set-up with both drones and small satellites [21]. At the Norwegian University of Science and Technology (NTNU), the Hyperspectral Smallsat for Ocean Observations (HYPSO) cubesat mission will use a small HSI to observe ocean color [3]. Observing a relatively dark target such as deep oceans using a hyperspectral instrument from space, however, requires larger throughput than what the HSI V4 could provide. Replicating instruments used in other space missions for the HYPSO mission would require complex modifications. A new instrument design was therefore needed.

This paper presents a new and optimized design for the same type of COTS-based pushbroom HSI with a transmission grating as previously presented for the HSI V4 [18], but with improved throughput and image quality. The new instrument is named the HSI V6. The design and assembly is presented in a do-it-yourself (DIY) way to make it easy for anyone to replicate the design and build their own version of the instrument. The HSI V6 design has already been further optimized and modified for the HYPSO-1 cubesat mission, described by Prentice et al. [22], which also includes information on how to prepare components for the space environment. The basis of the design presented in this report, however, can be useful for both drone and land based applications, and provides more general information on the design. The instrument design, including calculations on the theoretical FWHM and improved throughput, is presented in Sec. 2. Instructions on how to build the imager are briefly given, with detailed instruction further provided in Supplement 1. Optical performance of the instrument is presented in Sec. 3, including calibration and comparison of the measured throughput, smile and keystone in HSI V4 and HSI V6. Test images (hyperspectral datacubes) using the HSI V6 are presented in Sec. 4, showing that the instrument works as expected. In addition, a method using a cut-off filter to characterize and remove second order diffraction effects to increase the usable spectral range is presented and discussed in Sec. 5. Further development and validation of this method can increase the usable spectral range in the instrument, as higher wavelengths no longer will be contaminated with second order effects.

2. Instrument design

The main design goals of both the HSI V4 and HSI V6 are a small instrument covering the spectral range of 400 nm to 800 nm, with a FWHM less than 5 nm. The center wavelength used is $\lambda _c = 600$ nm. The HSI V6 also follows many of the requirements from the HYPSO-1 mission, such as for spatial resolution and signal-to-noise ratio (SNR), as presented in [3]. The main goal of the HSI V6 design is to increase the throughput (compared to the HSI V4 design) to further increase the SNR, which is useful when, for example, observing phytoplankton pigments (see [3] and references therein).

Since the HSI V4 design uses COTS components, the basic optical design can easily be reused for the HSI V6. The overall layout of the optical diagram is not changed and a transmissive grating design is still used. However, the main components such as the objectives, slit and grating are changed to larger components to fit C-mount lens apertures. Higher quality optics also improves the imaging quality, shown by less pixel shifts due to the smile and keystone effects in the HSI V6 (shown in Sec. 3.4).

2.1 Optical design

The optical diagram shown in Fig. 1 illustrates the general idea of the design. The main parts are the front objective ($L_0$), slit ($S$), collimating objective ($L_1$), grating ($G$), detector objective ($L_2$), and finally the detector with the image sensor for capturing the spectrograms.


Fig. 1. Optical diagram of the HSI V6 showing the front objective ($L_0$), slit ($S$), collimating objective ($L_1$), grating ($G$), detector objective ($L_2$), and the sensor with length $X$ in spectral direction. The diameter of the lenses is $D = 18.4$ mm and the diffraction angle is $\beta = 10.37^{\circ}$. $L_0$ and $L_1$ are set to F/2.8, while $L_2$ is set to F/2.


2.1.1 Throughput

To achieve higher throughput, the size of the instrument is increased to make room for larger and higher quality optics. The small Edmund Optics (EO) S-mount lens elements used in the HSI V4 are replaced by EO visible (VIS)/near infrared (NIR) C-mount objectives, providing a larger aperture. The focal lengths are $f_0 = f_1 = f_2 = 50$ mm and the back flange focal distances are $17.526$ mm, typical for C-mount objectives. Three identical objectives are used, with $L_0$ and $L_1$ set to F/2.8, while $L_2$ is set to F/2 to collect as much as possible of the light after it is dispersed by the grating. Imaging at F/2.8 increases the amount of light collected by the system as the aperture is larger, compared to the HSI V4 design imaging at F/4. The effective aperture is 18.14 mm with an input F-number of 2.8.

The theoretical spectral throughput can be calculated as

$$\mathbf{\Phi}_{\lambda} = B_{\lambda} E_{\lambda} T_{\lambda} G,$$
where $B_{\lambda }$ is the spectral radiance, $E_{\lambda }$ is the grating efficiency at the first spectral order for each wavelength, $T_{\lambda }$ represents the geometric losses and transmission factors of the optical elements, and $G$ is the etendue. The etendue $G$ is further defined as
$$G = \frac{G_A \cos{\alpha}}{f_1^{2}}w' h',$$
where $G_A \cos {\alpha }$ is the illuminated area of the grating, $f_1$ is the focal length of the collimating objective, $w'$ is the slit width magnification and $h'$ is the slit height magnification. The slit width- and height magnifications can be calculated as
$$w' = \frac{f_2}{f_1} \frac{\cos{\alpha}}{\cos{\beta}} w$$
and
$$h'= \frac{f_2}{f_1} h,$$
respectively, where $w$ is the slit width, $h$ is the slit height, $\alpha$ is the incident angle of the incoming light, $\beta$ is the diffraction angle of the grating, and $f_1$ and $f_2$ are the focal lengths of the collimating and detector objectives, respectively [23].

By using the parameters presented in this report and in [18] and assuming the same spectral radiance ($B_{\lambda }$) and geometric losses and transmission factors of the optical elements ($T_{\lambda }$), the theoretical spectral throughput for the center wavelength $\lambda _c = 600$ nm of the HSI V4 and HSI V6 can be compared and gives

$$\frac{\mathbf{\Phi}_{\lambda, V6}}{\mathbf{\Phi}_{\lambda, V4}} = 9.75,$$
meaning that the HSI V6 is expected to have about 10 times more throughput than the original HSI V4 at 600 nm. The HSI V4 used in this report for measuring and comparing optical performance, however, has been modified with a collimator lens with $f_1=25$ mm instead of $f_1=30$ mm. This gives a theoretical throughput ratio of 6.77 at 600 nm. Repeating these calculations for different wavelengths shows that the ratio ranges from about 4.6 at 400 nm to about 7.8 at 800 nm (for the HSI V4 used in this report). The values vary mostly due to differences in grating efficiencies of the two gratings.
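As an illustration, the etendue of Eqs. (2)-(4) can be evaluated numerically. The sketch below (Python, assuming NumPy) uses the HSI V6 parameters stated in this section and approximates the illuminated grating area by the collimated beam cross-section; the corresponding HSI V4 values would be taken from [18], and the grating efficiencies $E_{\lambda}$ must be included separately when forming the throughput ratio.

```python
import numpy as np

def etendue(f1_mm, f2_mm, slit_w_mm, slit_h_mm, f_number, alpha_deg, beta_deg):
    """Etendue from Eqs. (2)-(4): G = (G_A cos(alpha) / f1^2) * w' * h'.
    The projected illuminated grating area G_A cos(alpha) is approximated by
    the collimated beam cross-section, a circle of diameter f1 / F-number."""
    alpha, beta = np.radians(alpha_deg), np.radians(beta_deg)
    beam_d = f1_mm / f_number
    projected_area = np.pi * (beam_d / 2.0) ** 2                              # G_A * cos(alpha)
    w_mag = (f2_mm / f1_mm) * (np.cos(alpha) / np.cos(beta)) * slit_w_mm      # Eq. (3)
    h_mag = (f2_mm / f1_mm) * slit_h_mm                                       # Eq. (4)
    return projected_area / f1_mm ** 2 * w_mag * h_mag                        # [mm^2 sr]

# HSI V6: 50 mm objectives at F/2.8, 50 um x 7 mm slit, alpha = 0 deg, beta = 10.37 deg
G_v6 = etendue(50.0, 50.0, 0.050, 7.0, 2.8, 0.0, 10.37)
print(f"HSI V6 etendue: {G_v6:.4f} mm^2 sr")
```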

2.1.2 Grating

The grating is changed from a 600 grooves/mm grating in the HSI V4 to a 300 grooves/mm transmission grating in the HSI V6, since the 300 grooves/mm grating has an overall higher efficiency. It is blazed at $17.5^{\circ}$ and has above $50\%$ efficiency for the wavelength range of 400 nm to 800 nm, peaking at $75\%$ around 500 nm. The diffraction angle is calculated for the center wavelength $\lambda _c = 600$ nm, as

$$\beta = \arcsin{ \left( \frac{k \lambda_c}{a} \right) } = 10.37^{\circ},$$
where $a=3.33\ \mu$m is the spacing between the grooves in the grating and $k=1$ represents the first spectral order.

2.1.3 Full width at half maximum

The theoretical FWHM is calculated as

$$\text{FWHM} = \frac{w a\cos{ \left( \alpha \right) }}{k f_1} = 3.33 \text{ nm},$$
where $w=50\ \mu$m is the slit width, $\alpha =0^{\circ}$ is the incident angle of the incoming light and $f_1=50$ mm is the focal length of the collimating objective [24,25]. Both changing the grating to one with fewer grooves/mm and increasing the slit width from $25\ \mu$m to $50\ \mu$m widen the theoretical FWHM. On the other hand, the longer focal length of the objectives reduces the FWHM, so that the final theoretical FWHM is 3.33 nm, compared to 1.4 nm for the HSI V4.
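As a quick numeric check, Eqs. (6) and (7) can be evaluated directly; a minimal sketch assuming NumPy, reproducing the diffraction angle and the theoretical FWHM quoted above:

```python
import numpy as np

grooves_per_mm = 300
a_nm = 1e6 / grooves_per_mm        # groove spacing: 3333.3 nm (3.33 um)
k = 1                              # first spectral order
lambda_c_nm = 600.0                # center wavelength [nm]
w_nm = 50e3                        # slit width: 50 um in nm
f1_nm = 50e6                       # collimator focal length: 50 mm in nm
alpha = 0.0                        # incidence angle [rad]

beta_deg = np.degrees(np.arcsin(k * lambda_c_nm / a_nm))     # Eq. (6): ~10.37 deg
fwhm_nm = w_nm * a_nm * np.cos(alpha) / (k * f1_nm)          # Eq. (7): ~3.33 nm
print(f"beta = {beta_deg:.2f} deg, FWHM = {fwhm_nm:.2f} nm")
```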

Many ocean color phenomena require a spectral resolution of less than $5$ nm [26]. The increase in FWHM was therefore seen as an acceptable trade-off as it is still within the typical requirement for ocean color applications, and since the HSI V6 is primarily developed for observing the ocean (hence the importance of increasing the throughput). The wider slit and higher efficiency in the grating increases the theoretical throughput [3,22], which is the primary goal for the HSI V6 design.

2.1.4 Instantaneous field of view

The instantaneous field of view (IFOV) of the instrument is defined as

$$\text{IFOV} = 2 \arctan{ \left( \frac{w}{2 f_0} \right) } ,$$
where $w$ is the slit width or slit height depending on along-track or cross-track direction, respectively, and $f_0$ is the focal length [27]. The IFOV is calculated to be $0.057^{\circ}$ in the along-track direction and $8.008^{\circ}$ in the cross-track direction. For example, when used on a satellite platform in an orbit at an altitude of about 500 km, this gives a spatial resolution of about 60 m and a swath width of 70.32 km [3,22].
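Eq. (8) and the resulting swath can be computed as in the sketch below (assuming NumPy; the 500 km altitude is the example used in the text):

```python
import numpy as np

def ifov_deg(slit_extent_mm, f0_mm):
    """Instantaneous field of view, Eq. (8)."""
    return np.degrees(2.0 * np.arctan(slit_extent_mm / (2.0 * f0_mm)))

f0 = 50.0                                # front objective focal length [mm]
ifov_along = ifov_deg(0.050, f0)         # 50 um slit width  -> ~0.057 deg
ifov_cross = ifov_deg(7.0, f0)           # 7 mm slit height  -> ~8.0 deg

altitude_km = 500.0                      # example satellite altitude
swath_km = 2.0 * altitude_km * np.tan(np.radians(ifov_cross / 2.0))   # ~70 km
print(ifov_along, ifov_cross, swath_km)
```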

2.2 Components

A list of the parts used is shown in Table 1. How the parts fit together is shown in an exploded view of the instrument in Fig. 2. Most parts are COTS components from Thorlabs and Edmund Optics. In addition to the COTS components, two 3D printed parts are needed.


Fig. 2. Exploded view showing the components of HSI V6 in CAD. (1) Objective, (2) spacer ring, (3) adapter ring, (4) lens tube, (5) slit, (6) cage plate for lens tube, (7+8+9) steel rods and swivels, (10) mounting bracket, (11) grating holder, (12) grating, (13) cage plate for detector mount, (14) detector mount, (15) detector. Numbers coincide with item numbers in Table 1.


Table 1. HSI V6 detailed list of parts (1 inch = 2.54 cm).

The front objective back focal flange distance is 17.526 mm; any standard C-mount objective may therefore be used as front optics. Using two identical objectives facing the entrance slit makes the use of baffles or field lenses to collimate the light before it reaches the grating superfluous, since the optical properties of the objectives are the same.

2.2.1 Choosing the detector

The detector used is an industrial global shutter camera from Imaging Development Systems (iDS) with a Sony IMX174 CMOS image sensor. It is mainly chosen as it has high Quantum Efficiency (QE) in the visible range, and fulfills resolution and SNR requirements for the HYPSO-1 mission [3,22]. It can easily be switched out with a different detector if desired. Among others, detector heads with Sony IMX249, Sony IMX252 and CMOSIS CMV2000 sensors have been tested during the development of the HSI V6. The Sony IMX249 is a less expensive version of the IMX174, with similar optical performance but a lower frame rate. The Sony IMX252 has smaller pixels than the IMX174/249, and a smaller sensor area. Both the IMX252 and CMV2000 are more sensitive at NIR wavelengths, at the cost of lower QE in the VIS range. Other parameters to consider when choosing the detector include sensor size, number of pixels, pixel size, pixel pitch, well depth, and bit depth.

2.2.2 3D printed parts

The parts that need to be 3D printed are the grating holder and the iDS detector mount. Both are constructed to be used with the 30 mm cage system from Thorlabs. The CAD files are available online: http://kho.unis.no/doc/Gratingholder.zip for the grating holder and http://kho.unis.no/doc/IDSinsert.zip for the detector mount. The grating holder consists of two parts that are held together by two 4 mm bolts, spacers and lock nuts, as seen in Fig. 3. How to assemble the grating holder is described in Supplement 1.


Fig. 3. Assembled 3D printed grating holder. All measurements are in units of mm.


2.3 Assembly

Before the full instrument is assembled, the focus of the objectives must be set. For remote sensing from drones and satellites, the focus of all objectives is set to infinity. For other applications imaging at closer distances, the front objective focus may be adjusted accordingly.

The next step is to assemble the slit tube. The slit is placed inside the lens tube and fastened with the retaining rings, so that the slit is located in the center of the tube (note that the slit holder is not symmetrical; the slit is closer to one side than the other). The grating is then mounted into the grating holder. Since the grating is the most brittle component, it is good practice to first test the assembly using a glass window of the same size as the grating (e.g. EO#46-097), to check whether the 3D print is too tight and must be trimmed before continuing.

Finally, the detector is mounted to the 3D printed detector mount. The full instrument is then assembled by adding the front and collimating objectives to the slit tube, sliding it towards the grating holder and fastening it with the cage rails. The imaging objective is fastened to the detector subassembly, slid towards the back of the grating holder and fastened with the cage rails.

The fine focus is adjusted by the use of brass spacer rings to ensure the correct distance from the objectives to the slit. This compensates for small dimensional differences in the COTS components. Adding and removing spacer rings to obtain the optimal focus can be a cumbersome process, but this fine tuning is important, as small changes in the distance can visibly degrade the spectral and spatial resolution in the spectrogram.

For more detailed instruction on how to build the instrument, including some helpful images and tips and tricks, see Supplement 1.

3. Optical performance

The optical performance was measured and compared using different calibration and characterization procedures. Spectral and radiometric calibration were performed to achieve known and comparable values. Spectral calibration data was further used to investigate the FWHM, and a second set of radiometric calibration data was used to compare the measured throughput of the HSI V4 and HSI V6. Finally, smile and keystone were measured for both instruments and compared.

3.1 Calibration

Spectral calibration was done using argon (Newport model 6030) and mercury-argon (Newport model 6035) vapor tubes with known emission lines, shown in Fig. 4. The exact spectral range reaching the sensor, which can be referred to as the full spectral range, differs slightly between different individual models as it depends heavily on the exact placement of the components. The full spectral range can easily be found using the spectral calibration data, and is from below 300 nm to above 900 nm for the IMX174 sensor, as seen in Fig. 4. If using a smaller sensor, such as the IMX252, the full spectral range is shortened and spans from just below 400 nm to about 850 nm. The calibration data shows that there is almost no light recorded below 400 nm, which is mainly due to the coating on the objectives blocking light below 400 nm. The signal above 800 nm is low due to low QE of the sensor, and may be contaminated by second order diffraction effects. The usable spectral range is therefore 400 nm to 800 nm, which is the designed spectral range of the instrument. The full spectral range differs from the designed spectral range simply because the sensor is larger than needed. The full spectral range is shown for the calibration data since the wavelengths above 800 nm are used to investigate the second order diffraction effects in Sec. 5, while the designed spectral range is used for the rest of the data shown in the report.
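A minimal sketch of how such a spectral calibration can be implemented, assuming NumPy: the detector columns of a few identified Hg/Ar emission peaks are fitted with a low-order polynomial to give a wavelength for every column. The column positions below are hypothetical placeholders, the wavelengths are standard Hg/Ar lines, and the 1936-column width is assumed for the IMX174 readout.

```python
import numpy as np

# Detector columns where known emission peaks were identified (hypothetical values)
peak_columns = np.array([310.0, 505.0, 668.0, 912.0, 1248.0, 1533.0])
# Corresponding standard Hg/Ar emission lines [nm]
peak_wavelengths = np.array([404.66, 435.83, 546.07, 696.54, 763.51, 811.53])

# Second-order polynomial mapping column index -> wavelength
coeffs = np.polyfit(peak_columns, peak_wavelengths, deg=2)
wavelength_axis = np.polyval(coeffs, np.arange(1936))   # one wavelength per detector column
```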


Fig. 4. HSI V6 calibration data. Blue lines show spectral calibration peaks from the argon and mercury lamps. The green line is the measured spectrum from radiometric calibration, while the black dashed line is the reference spectrum. All recorded values are shown in counts (left axis), while the reference spectrum is given as radiance (right axis). The spectra are sampled from the center horizontal row of the detector.


For radiometric calibration, an integrating sphere (Model ISS-30VA, Gigahertz Optik) with a certified tungsten halogen lamp with reference radiance from 400 nm to 2500 nm with a 10 nm resolution was used. The recorded spectrum is shown in green in Fig. 4.
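Radiometric calibration then reduces to a per-pixel conversion between counts per second and the certified sphere radiance. A minimal sketch, assuming NumPy and hypothetical file names for the recorded and reference data:

```python
import numpy as np

counts = np.load("sphere_counts.npy")          # dark-corrected spectrogram of the sphere (rows x cols)
t_exp = 0.02                                   # exposure time used for the sphere recording [s]
ref_radiance = np.load("sphere_radiance.npy")  # certified radiance interpolated onto the wavelength axis (cols,)

# Per-pixel conversion coefficients: radiance per (count / second)
coeff = ref_radiance[np.newaxis, :] / (counts / t_exp)

def to_radiance(raw_counts, exposure_s):
    """Convert a dark-corrected raw spectrogram to radiance."""
    return coeff * raw_counts / exposure_s
```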

3.2 Full width at half maximum

Spectral resolution, measured as the FWHM, was calculated from the spectral calibration data shown in Fig. 4. The FWHM of the mercury peak at 546.1 nm was found to be 3.69 nm, which is close to the theoretical value of 3.33 nm. Achieving a measured FWHM equal to the theoretical value is difficult, as the focus of the objectives and all distances between the optical elements must be extremely precise. In practice this is hard to accomplish without spending a large amount of time and resources on precision mechanics. However, values close to the theoretical FWHM are achievable.
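The measured FWHM can be obtained by fitting a Gaussian to an isolated calibration peak. A sketch assuming NumPy/SciPy, where `wavelength_axis` and `spectrum` are placeholders for the calibrated wavelength axis and a recorded calibration spectrum:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

# Fit the Hg 546.1 nm line within a small window around the peak
window = (wavelength_axis > 540.0) & (wavelength_axis < 552.0)
x, y = wavelength_axis[window], spectrum[window]
p0 = [y.max() - y.min(), 546.1, 1.5, y.min()]           # initial guess
popt, _ = curve_fit(gaussian, x, y, p0=p0)
fwhm_nm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(popt[2])   # FWHM = 2.355 * sigma
```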

3.3 Throughput

The ratio of the throughput in the HSI V6 to the HSI V4 was also measured experimentally. Both imagers were placed in front of a Lambertian screen (Labsphere SRT-99-180) illuminated by a 1000 W tungsten lamp (ORIEL SN7-1275), with a distance of 1 m between the lamp and the screen. The exposure times were adjusted so that the captured images were nearly overexposed to limit the effects of noise. The signal was then scaled by exposure time to obtain comparable values, as shown in Fig. 5. The measured throughput ratio at 600 nm was found to be 6.77, which is the same as the theoretical value found in Sec. 2.1.1, and verifies that the throughput is increased in the HSI V6, as expected. The average measured throughput ratio in the spectral range of 400 nm to 800 nm was 6.91.
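The measured ratio is simply the exposure-time-normalized signals of the two imagers compared on a common wavelength axis; a sketch assuming NumPy, with all array and variable names as placeholders for the recorded data:

```python
import numpy as np

# wl_v4 / wl_v6: calibrated wavelength axes; counts_v4 / counts_v6: mean screen spectra;
# t_v4 / t_v6: exposure times [s]. All are placeholders.
signal_v4 = np.interp(wl_v6, wl_v4, counts_v4) / t_v4    # resample V4 onto the V6 axis, scale by exposure
signal_v6 = counts_v6 / t_v6
ratio = signal_v6 / signal_v4
print("throughput ratio at 600 nm:", np.interp(600.0, wl_v6, ratio))
```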


Fig. 5. Measured radiometric response in the HSI V4 and HSI V6 divided by exposure time, showing the improved throughput experimentally.


3.4 Smile and keystone

The smile effect is an optical distortion which causes the spectral lines to shift row position as a function of slit height (spatial axis), resulting in a curved spectral line in the spectrogram. The keystone effect makes spatial features look skewed by changing the magnification as a function of wavelength. These effects can be measured and corrected simultaneously as done for the HSI V4 in [28] and the modified HSI V6 used in the HYPSO-1 mission in [29].

Low smile and keystone distortions before correction are still desired to ensure good data quality. The higher quality objectives chosen for the HSI V6 design result in smaller pixel shifts due to both smile and keystone, as seen in Fig. 6. The HSI V4 uses a 3 mm slit which limits the area of the sensor that is illuminated, represented by the shaded gray area in Fig. 6(a). Both for the full range and for the non-shaded area it can be seen that the pixel shifts due to smile are much smaller for the HSI V6 than for the HSI V4. The curve for the HSI V6 is slightly tilted, causing larger pixel shifts for higher wavelengths (above 600 nm). This spectral tilt is most likely due to the slit not being positioned perfectly straight. Figure 6(b) shows the maximum measured pixel shift due to keystone, which is also reduced in the HSI V6 design. Spectral tilt, smile, and keystone can all be corrected in software to some extent [19,28,29].
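Smile can be quantified from a monochromatic (calibration lamp) spectrogram by locating the line center in every detector row and recording its shift relative to the center row; keystone can be measured analogously along the spectral axis using a narrow spatial target. A minimal sketch of the smile part, assuming NumPy and a placeholder `spectrogram` array with the spatial axis along rows and the spectral axis along columns:

```python
import numpy as np

def line_center(profile):
    """Sub-pixel peak position of a 1-D profile via an intensity-weighted centroid."""
    cols = np.arange(profile.size)
    weights = np.clip(profile - profile.min(), 0.0, None)
    return np.sum(cols * weights) / np.sum(weights)

centers = np.array([line_center(row) for row in spectrogram])
smile_shift_px = centers - centers[len(centers) // 2]   # pixel shift relative to the center row
```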


Fig. 6. Measured pixel shifts due to smile (a) and keystone (b) in the HSI V4 and HSI V6. In (a), the shaded area corresponds to where no light is recorded when using a 3 mm long slit (as done in the original HSI V4 design) instead of a 7 mm long slit which illuminates the full sensor (as used for the HSI V6).


4. Proof of concept

The HSI V6 was used to acquire hyperspectral datacubes at the University Centre in Svalbard (UNIS) for proof of functionality. The exact setup is described in [30]. A Syrp Genie mini motion control system was used to ensure a smooth rotation when capturing spectrograms. The data was recorded using the iDS software uEye Cockpit, and stored as 8-bit .avi video files to minimize the data size. Settings such as exposure time and frame rate were adjusted according to the light conditions, and the scan duration was adjusted to obtain about 3400 frames per datacube for a scan of 30$^{\circ}$. Examples of how the hyperspectral image processing pipeline can be implemented can be found in [21]. The reduction from 12-bit data to 8-bit data should be done with care, as it significantly affects the intensity resolution of the data. This is not critical when looking at the spatial performance of the HSI, as done with the created red-green-blue (RGB) images in Fig. 7 and Fig. 10, but affects the spectral signatures as seen in Fig. 8. It is recommended to store data as 12-bit if possible, especially if the data is collected for scientific use.
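A sketch of how such a stored 8-bit .avi scan can be stacked into a datacube with axes (frame/along-track, spatial, spectral), assuming OpenCV and NumPy; the file name is a placeholder:

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("scan.avi")
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # The spectrograms are monochrome; collapse any color axis added by the codec
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()

cube = np.stack(frames)   # shape: (n_frames, spatial rows, spectral columns)
```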


Fig. 7. Longyearbyen harbor, 26-08-2019. (a) Mobile phone image, for reference. (b) RGB image generated from the hyperspectral datacube, using wavelengths 660 nm, 540 nm and 480 nm. Data from [30].



Fig. 8. Point spectra from the Longyearbyen harbor datacube, showing sky with diffuse clouds (blue) and ground in the mountainside covered in grass (green).


A hyperspectral datacube of the Longyearbyen harbor was recorded on 26-08-2019. For reference, an image of the harbor was taken simultaneously with a mobile phone (Fig. 7(a)). An RGB image was created from the hyperspectral data using the wavelengths 660 nm, 540 nm and 480 nm for the red, green and blue channels, respectively, as is shown in Fig. 7(b). This shows a close to "real" RGB image as the wavelengths chosen are similar to those seen by the human eye. Details in the harbor are visible in the generated image, showing that the imager is capable of capturing spatial features. The generated RGB images shown in Fig.  7 and Fig. 10 are not gamma corrected, which should be done if comparing the mobile phone reference images with the generated RGB images in more detail.
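Generating such an RGB composite amounts to picking the bands closest to 660 nm, 540 nm and 480 nm and normalizing; a sketch assuming NumPy, with `cube` and `wavelength_axis` as placeholders, including the display gamma correction mentioned above:

```python
import numpy as np

def nearest_band(wavelength_axis, target_nm):
    return int(np.argmin(np.abs(wavelength_axis - target_nm)))

bands = [nearest_band(wavelength_axis, wl) for wl in (660.0, 540.0, 480.0)]
rgb = np.stack([cube[:, :, b] for b in bands], axis=-1).astype(float)
rgb /= rgb.max()
rgb_display = rgb ** (1.0 / 2.2)   # simple gamma correction for display
```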

As the data is hyperspectral, each pixel in the spatial image in Fig. 7(b) holds spectral information. Point spectra of the sky and ground in the Longyearbyen harbor datacube, pixel positions marked in Fig. 7(b), are shown in Fig. 8. The sky spectrum has a high response between 400 nm and 600 nm, while the ground spectrum has a lower response overall except for around 700 nm to 800 nm. The calibrated values (not shown) show the same trends, with even higher response around 700 nm for the ground spectrum. This coincides with the expected values, as the sky spectrum generally peaks around 550 nm and has low values above 700 nm, while the spectral signature of vegetation, which is present in the ground pixel, has a high signal from 700 nm and up.

5. Second order diffraction effects

The full spectral range of both the HSI V4 and the HSI V6 is from about 250 nm to 1000 nm. Due to the coating on the objectives, the low QE of the detector, and the low grating efficiency at lower wavelengths, there is very little signal reaching the sensor below 400 nm, and no signal below 380 nm. This is confirmed by the recorded spectra in Fig. 4.

Due to the grating design, second order diffraction effects are expected to appear for the higher wavelengths. For a diffraction grating, the relationship between the overlapping orders of light is linear (as opposed to grism designs which can have a nonlinear relationship) and follows the relation

$$\lambda_{k} = \frac{(k+1)\lambda_{k+1}}{k},$$
where $\lambda _{k}$ is a wavelength at one order lower than $\lambda _{k+1}$ that reaches the same area on the detector [31]. Second order effects are therefore expected to appear from about 760-800 nm and higher in both the HSI V4 and the HSI V6.

To measure the contributions from these second order effects, a cut-off filter with zero transmission below 575 nm was used. Measurements of the same source were done with the HSI V6 on 21-10-2019, without and with the cut-off filter. The recorded spectra can be seen in Fig. 9 as a blue and a green line, respectively. Contributions from the second order effects can be seen for the higher wavelengths where the blue and green lines diverge.


Fig. 9. Recorded spectrum with HSI V6 without (blue) and with (green) cut-off filter used to measure second order effects, and the resulting corrected spectrum (black).


The difference in the recorded signals and the known efficiency of the filter were used to estimate the efficiency of the second order light, $A_{\lambda }$, for all wavelengths above 780 nm. The second order efficiency was further used to compensate for, and thereby correct, the second order diffraction effects by estimating the amount of second order light that would appear based on the incoming light at lower wavelengths and subtracting it. The correction is described by the formula

$$C_{\lambda, corr} = C_{\lambda} - C_{\lambda/2} A_{\lambda},$$
where $C_{\lambda, corr}$ is the number of counts at a first order higher wavelength without any second order diffraction signal, $C_{\lambda }$ represents the number of measured counts at the first order higher wavelength, $C_{\lambda /2}$ is the number of measured counts at the corresponding second order lower wavelength, and $A_{\lambda }$ is the calculated efficiency describing the amount of generated second order effects expected to occur at the higher wavelength. The recorded spectrum after second order correction is also shown in Fig. 9 as a black dashed line. It follows the original signal for lower wavelengths (up to 780 nm), and follows the signal recorded with the cut-off filter (without second order light) for the higher wavelengths (above 780 nm).
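A sketch of the characterization and correction, assuming NumPy. The second order efficiency $A_{\lambda}$ is here estimated from the paired measurements without and with the cut-off filter, dividing the filtered spectrum by the known filter transmission to recover an uncontaminated first order signal; this is one way to realize the description above, and all array names are placeholders. Eq. (10) is then applied to a new spectrum.

```python
import numpy as np

def second_order_efficiency(wl, no_filter, with_filter, filter_transmission, wl_min=780.0):
    """Estimate A_lambda above wl_min from measurements without/with the cut-off filter."""
    # Contamination-free first order signal where the filter transmits
    first_order = with_filter / np.maximum(filter_transmission, 1e-9)
    # Counts at the lower (second order source) wavelength, lambda/2
    c_half = np.interp(wl / 2.0, wl, no_filter)
    A = (no_filter - first_order) / np.maximum(c_half, 1e-9)
    return np.where(wl >= wl_min, A, 0.0)

def correct_second_order(wl, counts, A):
    """Eq. (10): C_corr = C - C_{lambda/2} * A_lambda."""
    return counts - np.interp(wl / 2.0, wl, counts) * A
```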

An example of captured data which visualizes the effect of the second order light is the hyperspectral datacube showing Adventdalen recorded on 09-09-2019, shown in Fig. 10. Figure 10(a) shows again a reference image taken simultaneously with a mobile phone, and Fig. 10(b) shows a "normal" RGB image made using the wavelengths 660 nm, 540 nm and 480 nm as red, green and blue channels.


Fig. 10. Adventdalen, 09-09-2019. (a) Mobile phone image, for reference. (b)-(e) RGB images generated from the hyperspectral datacube using the wavelengths (b) 660 nm, 540 nm and 480 nm, (c) 780 nm, 540 nm and 480 nm, (d) 860 nm, 540 nm and 480 nm, (e) 860 nm, 540 nm and 480 nm after second order correction. Data from [30].


Figure 10(c) shows an RGB image using a red channel closer to the NIR, at 780 nm, which results in a more teal-looking image. This is as expected since the snow and sky reflect less light closer to the NIR, contributing less light to the red channel in the created RGB image. The vegetation in the mountainside and on the ground appears red since vegetation reflects light at NIR wavelengths, thereby contributing more signal to the red channel. When changing the red component to a wavelength above 800 nm, however, second order effects become apparent, as seen in Fig. 10(d) where the red channel is set to 860 nm. Red color from the vegetation on the ground is expected, but the pink hue seen in the sky and on the snowy mountains is not, since the signal should be lower at 860 nm than at 780 nm, thereby creating an even more teal color than in Fig. 10(c). The extra signal in the 860 nm channel is due to the second order light.

The second order corrected image is shown in Fig. 10(e). The color of the sky has changed from pink to teal, which is the expected outcome of the correction. The vegetation on ground is still visible with red color, which is also as expected. The corrected image looks similar to the RGB image created using 780 nm in Fig. 10(c), but has slightly more teal color which is reasonable as they are both in the NIR range. Figure 10(e) gives a visual view on how the correction affects the image, but more work should be done to assess the accuracy of the correction. Further details on the measurements and the correction are described in [30].

Similar methods to measure and remove second order light were presented in [31] and [32], which also used filters to measure the relationship between the first and second orders of light and corrected by subtracting signal from the higher wavelengths. Estimating the second order contributions using only the intensity differences, however, makes the correction sensitive to the incoming light spectra. For the second order diffractions, the light spreads out which both decreases the intensity and increases the FWHM. A significant change in the incoming spectra will, therefore, result in a less accurate correction as the signal from neighboring wavelengths changes and is not captured. A monochromator can be used to capture these changes in both intensity and FWHM for several wavelength positions, and interpolation used to create a more extensive model [33]. This should be considered for future work.

Another method, using observational ocean data instead of laboratory data, was presented in [34]. There, observational data with underwater features such as coral reefs was used for characterization. Since solar radiation at the higher wavelengths (800 nm and above) is absorbed by the water, coral reef features appearing at the higher wavelengths must come from second order light. This was used to quantify the amount of second order light and correct the data during flight. This method is hard to test in the lab as it uses in-flight data. However, using parts of the dataset itself to characterize the second order effects makes the correction less sensitive to differences in the incoming light, even without measuring the broadening in FWHM.

6. Conclusions

In this paper, a new and optimized design for a COTS pushbroom HSI has been presented, together with a description of how to build the instrument in a DIY manner. The volume of the instrument is $220 \times 65 \times 65$ mm$^{3}$ with a mass of about 650 g. It is larger in size than the previously presented HSI V4 [18], but can still fit on drones and cubesat platforms.

A complete parts list is included. The estimated cost of parts for one instrument is about €2600. The design facilitates changes to the COTS components used, if desired, which makes it easy to tailor the imager to individual needs. One of the most expensive parts is the detector with the Sony IMX174 sensor, which can be switched out with a camera head using the Sony IMX249 sensor to make the instrument even cheaper without compromising on optical performance, or with the Sony IMX252 to make the instrument more sensitive to NIR wavelengths.

The larger optics, wider slit, and higher efficiency of the grating increase the total throughput of the imager, making it 7-10 times more sensitive than the HSI V4, depending on which version of the HSI V4 is used. Both the theoretical and the measured throughput ratio at 600 nm were found to be 6.77 for the instruments used in this report, showing a clear improvement in the HSI V6. The change of components also results in a wider theoretical FWHM of 3.3 nm, compared to 1.4 nm for the HSI V4. The increase in FWHM is found acceptable as ocean color applications often do not require better than 5 nm spectral resolution. The FWHM measured from spectral calibration data was found to be 3.69 nm at 546.1 nm, which is close to the theoretical value of 3.3 nm. Smile and keystone were also measured and compared for the HSI V4 and HSI V6, showing that pixel shifts due to both the smile and keystone effects are reduced in the HSI V6. Further reduction of both smile and keystone can be done by software correction [28,29].

A method to measure the efficiency of second order light using a cut-off filter and remove the unwanted effects was investigated. The results are promising, but further work should include both improving the model to accommodate changes in the incoming light spectrum as described in [33], and verifying the method and estimating the uncertainties of the correction.

Funding

Norges Forskningsråd (223254, 270959).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

Supplemental document

See Supplement 1 for supporting content.

References

1. G. Yang, J. Liu, C. Zhao, Z. Li, Y. Huang, H. Yu, B. Xu, X. Yang, D. Zhu, X. Zhang, R. Zhang, H. Feng, X. Zhao, Z. Li, H. Li, and H. Yang, “Unmanned aerial vehicle remote sensing for field-based crop phenotyping: current status and perspectives,” Front. Plant Sci. 8, 1111 (2017). [CrossRef]  

2. E. Honkavaara, H. Saari, J. Kaivosoja, I. Pölönen, T. Hakala, P. Litkey, J. Mäkynen, and L. Pesonen, “Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture,” Remote Sens. 5(10), 5006–5039 (2013). [CrossRef]  

3. M. E. Grøtte, R. Birkeland, E. Honoré-Livermore, S. Bakken, J. L. Garrett, E. F. Prentice, F. Sigernes, M. Orlandic, J. T. Gravdahl, and T. A. Johansen, “Ocean color hyperspectral remote sensing with high resolution and low latency - the HYPSO-1 CubeSat mission,” IEEE Trans. Geosci. Remote Sensing 60, 1000619 (2021). [CrossRef]  

4. R. Pillay, J. Y. Hardeberg, and S. George, “Hyperspectral imaging of art: acquisition and calibration workflows,” J. Am. Inst. Conserv. 58(1-2), 3–15 (2019). [CrossRef]  

5. R. Baissa, K. Labbassi, P. Launeau, A. Gaudin, and B. Ouajhain, “Using HySpex SWIR-320m hyperspectral data for the identification and mapping of minerals in hand specimens of carbonate rocks from the Ankloute Formation (Agadir Basin, Western Morocco),” J. Afr. Earth Sci. 61(1), 1–9 (2011). [CrossRef]  

6. K. Heia, A. H. Sivertsen, S. K. Stormo, E. Elvevoll, J. P. Wold, and H. Nilsen, “Detection of nematodes in cod (Gadus morhua) fillets by imaging spectroscopy,” J. Food Sci. 72(1), E011–E015 (2007). [CrossRef]  

7. M. E. Schaepman, S. L. Ustin, A. J. Plaza, T. H. Painter, J. Verrelst, and S. Liang, “Earth system science related imaging spectroscopy–an assessment,” Remote. Sens. Environ. 113, S123–S137 (2009). [CrossRef]  

8. R. N. Clark, R. H. Brown, R. Jaumann, D. P. Cruikshank, R. M. Nelson, B. J. Buratti, T. B. McCord, J. Lunine, K. H. Baines, G. Bellucci, J. P. Bibring, F. Capaccioni, P. Cerroni, A. Coradini, V. Formisano, Y. Langevin, D. L. Matson, V. Mennella, P. D. Nicholson, B. Sicardy, C. Sotin, T. M. Hoefen, J. M. Curchin, G. Hansen, K. Hibbits, and K. D. Matz, “Compositional maps of Saturn’s moon Phoebe from imaging spectroscopy,” Nature 435(7038), 66–69 (2005). [CrossRef]  

9. M. T. Eismann, Hyperspectral Remote Sensing (SPIE Press, Bellingham, Washington DC, 2012).

10. E. Herrala, J. T. Okkonen, T. S. Hyvarinen, M. Aikio, and J. Lammasniemi, “Imaging spectrometer for process industry applications,” in Proc. SPIE 2248, Optical Measurements and Sensors for the Process Industries, (1994).

11. F. Sigernes, D. A. Lorentzen, K. Heia, and T. Svenøe, “Multipurpose spectral imager,” Appl. Opt. 39(18), 3143–3153 (2000). [CrossRef]  

12. Z. Volent, G. Johnsen, and F. Sigernes, “Kelp forest mapping by use of airborne hyperspectral imager,” J. Appl. Remote Sens 1(1), 1 (2007). [CrossRef]  

13. H. Saari, V.-V. Aallos, A. Akujärvi, T. Antila, C. Holmlund, U. Kantojärvi, J. Mäkynen, and J. Ollila, “Novel miniaturized hyperspectral sensor for UAV and space applications,” in Proc. SPIE 7474, Sensors, Systems, and Next-Generation Satellites XIII, 74741M, (2009).

14. H. Saari, I. Pellikka, L. Pesonen, S. Tuominen, J. Heikkilä, C. Holmlund, J. Mäkynen, K. Ojala, and T. Antila, “Unmanned Aerial Vehicle (UAV) operated spectral camera system for forest and agriculture applications,” in Proc. SPIE 8174, Remote Sensing for Agriculture, Ecosystems, and Hydrology XIII, 81740H, (2011).

15. R. L. Lucke, M. Corson, N. R. McGlothlin, S. D. Butcher, D. L. Wood, D. R. Korwan, R. R. Li, W. A. Snyder, C. O. Davis, and D. T. Chen, “Hyperspectral Imager for the Coastal Ocean: instrument description and first images,” Appl. Opt. 50(11), 1501–1516 (2011). [CrossRef]  

16. A. Zuccaro Marchi, L. Maresi, and M. Taccola, “Technologies and designs for small optical missions,” in Proc. of SPIE 11180, International Conference on Space Optics – ICSO 2018, 111801Z, (2019).

17. H. Strese and L. Maresi, “Technology developments and status of hyperspectral instruments at the European Space Agency,” in Proc. SPIE 11151, Sensors, Systems, and Next-Generation Satellites XXII, 111510T, (2019).

18. F. Sigernes, M. Syrjäsuo, R. Storvold, J. Fortuna, M. E. Grøtte, and T. A. Johansen, “Do it yourself hyperspectral imager for handheld to airborne operations,” Opt. Express 26(5), 6021–6035 (2018). [CrossRef]  

19. K. A. Riihiaho, M. A. Eskelinen, and I. Pölönen, “A do-it-yourself hyperspectral imager brought to practice with open-source python,” Sensors 21(4), 1072 (2021). [CrossRef]  

20. J. Fortuna and T. A. Johansen, “A Lightweight Payload for Hyperspectral Remote Sensing Using Small UAVs,” in 9th Workshop on Hyperspectral Image and Signal Processing, Evolution in Remote Sensing (WHISPERS), (2018).

21. J. L. Garrett, S. Bakken, E. F. Prentice, D. Langer, F. S. Leira, E. Honoré-Livermore, R. Birkeland, M. E. Grøtte, T. A. Johansen, and M. Orlandi, “Hyperspectral image processing pipelines on multiple platforms for coordinated oceanographic observation,” in 11th Workshop on Hyperspectral Image and Signal Processing, Evolution in Remote Sensing (WHISPERS), (2021).

22. E. F. Prentice, M. E. Grøtte, F. Sigernes, and T. A. Johansen, “Design of a hyperspectral imager using COTS optics for small satellite applications,” in Proc. SPIE 11852, International Conference on Space Optics – ICSO 2020, 1185258, (2021).

23. F. Sigernes, Basic Hyperspectral Imaging (2018).

24. C. Palmer, Diffraction Gratings Handbook (Richardson Grating Laboratory, 2002), 5th ed.

25. J. M. Lerner and A. Thevenon, The optics of spectroscopy, a tutorial v2.0 (Joblin Yvon Instruments SA, Inc., 1998).

26. K. R. Turpie, B. County, T. Bell, S. Barbara, and H. M. Dierssen, “Global Observations of Coastal And Inland Aquatic Habitats,” Tech. Rep. November (2016).

27. R. A. Schowengerdt, Remote Sensing - Models and Methods for Image Processing (Elsevier Inc., 2007), 3rd ed.

28. M. B. Henriksen, J. L. Garrett, E. F. Prentice, A. Stahl, and T. A. Johansen, “Real-time corrections for a low-cost hyperspectral instrument,” in 10th Workshop on Hyperspectral Image and Signal Processing, Evolution in Remote Sensing (WHISPERS), (2019).

29. M. B. Henriksen, E. F. Prentice, T. A. Johansen, and F. Sigernes, “Pre-Launch Calibration of the HYPSO-1 Cubesat Hyperspectral Imager,” in IEEE Aerospace Conference, (2022). [accepted].

30. C. M. van Hazendonk, “Calibration of hyperspectral imager,” (2019) Tech. report [retrieved 24 March 2021], http://kho.unis.no/doc/Lotte_van_Hazendonk.pdf.

31. V. Stanishev, “Correcting second-order contamination in low-resolution spectra,” Astron. Nachr. 328(9), 948–952 (2007). [CrossRef]  

32. W. Lee, H. Lee, and J. W. Hahn, “Correction of spectral deformation by second-order diffraction overlap in a mid-infrared range grating spectrometer using a PbSe array detector,” Infrared Phys. Technol. 67, 327–332 (2014). [CrossRef]  

33. S. I. Bruchkouskaya, G. S. Litvinovich, I. I. Bruchkousky, and L. V. Katkovsky, “Algorithm for second-order diffraction correction in a concave diffraction grating spectrometer,” J. Appl. Spectrosc. 86(4), 671–677 (2019). [CrossRef]  

34. R. R. Li, R. Lucke, D. Korwan, and B. C. Gao, “A technique for removing second-order light effects from hyperspectral imaging data,” IEEE Trans. Geosci. Remote Sensing 50(3), 824–830 (2012). [CrossRef]  

Supplementary Material (1)

Supplement 1: Detailed instructions on how to assemble the HSI V6

