Abstract

Underwater spectral imaging is a promising method for the mapping, classification and health monitoring of coral reefs and seafloor inhabitants. However, the spectrum of light is distorted during the underwater imaging process due to wavelength-dependent attenuation by the water. This paper presents a model-based method that accurately restores the brightness of underwater spectral images captured with narrowband filters. A model is built for narrowband underwater spectral imaging. The model structure is derived from physical principles, representing the absorption, scattering and refraction by water and the optical properties of the narrowband filters, lens and image sensor. The model coefficients are calibrated based on spectral images captured underwater and in air. With the imaging model available, the energy loss due to water attenuation is restored for images captured at different underwater distances. An experimental setup is built and experiments are carried out to verify the proposed method. Underwater images captured within an underwater distance of 260 cm are restored and compared with those in air. The results show that the relative restoration error is 3.58% on average for the test images, demonstrating the accuracy of the proposed method.

© 2016 Optical Society of America

1. Introduction

Spectral imaging is an important method for high-fidelity color reproduction [1], object recognition and identification [2], health monitoring of plants or organisms [3], etc. Although most applications of spectral imaging are found on land, the technique has also proven valuable in subaqueous surroundings, e.g., in the detection of red tides [4], the classification of algae and corals [5–7], the health monitoring of corals [8, 9], and the identification of underwater minerals [10]. However, it is challenging to obtain high-fidelity spectral images of underwater objects, mainly because the wavelength-dependent absorption and scattering of the transmitted light by water distort the spectral energy distribution in underwater images and cause the images to be dark and hazy [11–13].

Intensive research has been done to remove the effects of the water column on images, by shortening the underwater imaging distance [6–8], comparing the measurement with references [10, 14–16], or correcting the spectral distortion based on attenuation models [17–22].

For instance, in [6, 8], the light path in water is made so short that the influence of water can be neglected. In some studies, a standard Spectralon panel with known reflectance is placed beside the underwater object of interest to estimate the attenuation of the water, so that the reflectance of the object can be determined [5, 10, 14, 15]. In the work by Mobley [16], a library is built beforehand, containing the spectral radiance reflected by various objects at different water depths in different water environments. A spectral measurement in the field is then compared with the spectra in the library and the one with the highest similarity is selected; hence the depth and optical properties of the water are determined.

Other studies are devoted to modeling the underwater imaging process, calibrating the optical properties of water (e.g., the attenuation and scattering coefficients) and correcting the underwater images. Water attenuation is modeled by the Beer-Lambert law in most works [17–24], but different models consider different affecting factors, e.g., the scattering of water [18, 23], vignetting in the image [20, 21, 23], and the response of the camera [19, 23]. To calibrate the water attenuation coefficient, spectrometers are commonly used to measure the downwelling radiance from the sunlight or atmosphere, or the spectrum reflected by a gray reference at different water depths [17–19, 24]. To simplify the calibration process, underwater images taken at different distances or water depths are also used [20, 21]. With the attenuation coefficient known, the brightness (or intensity) of underwater images is then corrected according to the Beer-Lambert law or a model of the imaging system [17–22].

It is worth noting that the work in [17–22] mainly focuses on color correction of underwater images captured with 3-channel color cameras, where broadband color filters (e.g., red, green and blue filters) with bandwidths of more than 100 nm are commonly used. But to distinguish spectral features of an object from the background, narrowband filters (typically with a full-width-half-maximum (FWHM) of no more than 10 nm) are desired, so that images can be acquired at characteristic wavelengths or specific wavelengths of interest [25, 26, 28].

Therefore, in this paper, the feasibility of using narrowband filters for underwater spectral imaging is studied, and the corresponding calibration and restoration methods are investigated. A model of underwater spectral imaging with narrowband filters is derived from the optical properties of water, the narrowband filter, the lens and the image sensor. The model coefficients (e.g., the attenuation coefficient of the water, the transmittance of the optical window) are calibrated from narrowband spectral images of the same object captured both in air and underwater. The energy loss in underwater images is then restored based on the imaging model and the underwater distance.

The main contribution of this paper lies in the exploration of narrowband filters for underwater spectral imaging and the corresponding calibration and restoration methods. Benefiting from the narrow bandwidth of the filters and the parameterization of the model, a relative restoration error of less than 5% is achieved.

The paper is structured as follows. The modeling of the underwater spectral imaging process is investigated in Section 2. In Section 3, the methods for model coefficient calibration and image restoration are presented with detailed algorithms. The experimental setup is described in Section 4, followed by the experiments and results in Section 5. The results are discussed in Section 6, and the paper is concluded in Section 7.

2. Modeling of narrowband underwater spectral imaging

2.1. Spectral imaging in air

The schematics of the narrowband spectral imaging systems under investigation are depicted in Fig. 1(a) and Fig. 1(b) for imaging in air and underwater, respectively. The imaging system mainly consists of a lens (or lens system), a tunable narrowband filter, and an image sensor (e.g., a CCD or CMOS sensor in a camera). The light emitted or reflected by the object is focused by the lens onto the image sensor, which is placed in the image plane of the lens. The passband of the filter can be tuned, e.g., using a filter wheel, a liquid crystal tunable filter, or an acousto-optic tunable filter, so that only light within a narrow wavelength range is transmitted to the image sensor.

Fig. 1 Illustration of narrowband spectral imaging in air and underwater. The image is enlarged due to refraction of water at the same object distance. The brightness of the image is reduced due to water attenuation as well as enlargement of the image.

Consider a point P on an object in air that is imaged by the imaging system. Denote the coordinates of point P as P(x,y), the object distance as z, and the focal length of the lens as f; then, the coordinates of the image point P′ (denoted as P′(x′, y′)) and the image distance (denoted as z′) are given by [29]

x' = \frac{f}{z-f}\,x, \qquad y' = \frac{f}{z-f}\,y, \qquad z' = \frac{f}{z-f}\,z. \qquad (1)

Since achromatic lenses are commonly used in spectral imaging systems to reduce chromatic aberration, the focal length of the lens is considered wavelength-independent within the operational wavelength range.

Let L(x,y,λ) be the radiance emitted or reflected from point P, where λ is the wavelength of light in air; then, the irradiance at P′ (denoted as E(x′,y′,λ,z)) can be represented as [29]

E(x', y', \lambda, z) = \underbrace{\frac{\pi}{4}\left(\frac{D}{f}\right)^{2}}_{C}\,\tau_l(\lambda)\,\tau_f(\lambda)\,\underbrace{\left(\frac{z-f}{z}\right)^{2}}_{G(z)}\,L(x, y, \lambda)\cos^4\theta, \qquad (2)
where D is the diameter of the aperture of the imaging system (e.g., the diameter of the lens), and τl(λ) and τf(λ) are the transmissivities of the lens and the filter, respectively. For simplicity of expression, the terms C and G(z) are defined as in Eq. (2), where C is related to the aperture and focal length of the lens, and G(z) shows how the irradiance changes with the object distance z and the focal length f.

The angle θ is the viewing angle, and cos²θ is given by

\cos^2\theta = \frac{z^2}{x^2 + y^2 + z^2}. \qquad (3)

The term cos⁴θ represents the natural vignetting of the imaging system, which leads to an inhomogeneous irradiance distribution on the image sensor. In this paper, only the paraxial case is considered, i.e., the size of the object is much smaller than the object distance z, so the angle θ is close to 0, cos θ ≈ 1, and the natural vignetting is neglected. Therefore, Eq. (2) is simplified as

E(x', y', \lambda, z) = C\,G(z)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x, y, \lambda). \qquad (4)

The image sensor in a digital camera usually consists of a matrix of pixels, where the irradiance at each pixel is converted into the brightness of the image. Suppose the light at the image point P′ falls on a pixel in the ith row and jth column of the image sensor. Denote the exposure area of the pixel as Si,j. The brightness of this pixel (denoted as I(i, j, λc, z)) can then be represented as

I(i, j, \lambda_c, z) = a\,t \iint_{S_{i,j}} \int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} q(\lambda)\,E(x', y', \lambda)\,d\lambda\,dx'\,dy' = a\,t \iint_{S_{i,j}} \int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} q(\lambda)\,C\,G(z)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x, y, \lambda)\,d\lambda\,dx'\,dy', \qquad (5)
where λc is the central wavelength of the filter, and the passband of the filter is represented as [λc − Δλ1, λc + Δλ2], where Δλ1 and Δλ2 are wavelength intervals. q(λ) is the spectral response of the image sensor, t is the exposure time of the image sensor, and a represents the conversion from optical power to pixel brightness. Because the passband of the filter is within the response range of the image sensor, the integration is over the passband of the filter.

As a pixel is on the scale of microns, the irradiance is considered uniform within the exposure area Si,j. The integration over the spatial coordinates and the wavelength can then be separated, and Eq. (5) can be written in a compact form as

I(i, j, \lambda_c, z) = \underbrace{a\,t \iint_{S_{i,j}} dx'\,dy'}_{A}\;C\,G(z)\,\underbrace{\int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} q(\lambda)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x, y, \lambda)\,d\lambda}_{H(x, y, \lambda_c)} = A\,C\,G(z)\,H(x, y, \lambda_c). \qquad (6)

The terms A and C depend on the configuration of the camera and the parameters of the lens, respectively. The influence of the object distance is embraced in G(z). The newly defined term H(x,y,λc) shows how the brightness of the image changes as the central wavelength of the filter is tuned.
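
To illustrate Eq. (6), H(x, y, λc) can be evaluated numerically by integrating the tabulated spectral curves over the filter passband. The following minimal sketch uses hypothetical values for q(λ), τl(λ), τf(λ) and L(λ) around an assumed 460 nm filter:

```python
import numpy as np

# Hypothetical tabulated curves over the passband of a 460 nm filter (assumed values).
lam = np.linspace(455.0, 465.0, 101)                 # wavelength grid in nm
q = np.full_like(lam, 0.45)                          # sensor spectral response (assumed flat)
tau_l = np.full_like(lam, 0.92)                      # lens transmissivity (assumed flat)
tau_f = np.exp(-0.5 * ((lam - 460.0) / 4.0) ** 2)    # filter transmission (assumed Gaussian)
L = np.full_like(lam, 1.0e-2)                        # object radiance (assumed flat)

# H(x, y, lam_c): integral of q * tau_l * tau_f * L over the passband, as in Eq. (6),
# computed here with the trapezoidal rule.
integrand = q * tau_l * tau_f * L
H = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(lam))
print(H)
```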

2.2. Underwater spectral imaging

When taking spectral images underwater, the imaging system must be sealed in a waterproof housing, and the underwater object is imaged through an optical window made of glass or sapphire (see Fig. 1(b)). Not only is the size of the image changed by the refraction of water, but the spectral radiance is also attenuated by the water.

Referring to Fig. 1(b), for the point P(x,y) on the object, the coordinates of the image point P′w are given by [29]

x'_w = \frac{f}{z_w - f}\,x, \qquad y'_w = \frac{f}{z_w - f}\,y, \qquad z'_w = \frac{f}{z_w - f}\,z_w, \qquad (7)
where (x′w, y′w) are the coordinates of the image point P′w in the image plane and z′w is the distance between the lens and the image. zw is the equivalent underwater distance (see Fig. 2), defined as
z_w = l' + \frac{l}{n_w} + \frac{d}{n_g}, \qquad (8)
where l′, l and d are the distances in air, water and glass, respectively, and nw and ng are the refractive indices of water and glass, respectively. In this study, as the underwater spectral imaging system operates in the wavelength range of [400 nm, 700 nm], the variation of ng and nw with wavelength is neglected. Since the refractive indices of water and glass are greater than 1, the object appears to be brought closer to the lens. As a consequence, the image is enlarged.
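
A minimal sketch of Eq. (8); the refractive indices used here (nw ≈ 1.33 for water, ng ≈ 1.5 for glass) are illustrative textbook values, not the calibrated ones:

```python
def equivalent_distance(l_air, l_water, d_glass, n_w=1.33, n_g=1.5):
    """Equivalent underwater distance z_w = l' + l/n_w + d/n_g (Eq. (8)), all in cm."""
    return l_air + l_water / n_w + d_glass / n_g

# Example with assumed geometry: 80 cm in air, 240 cm in water, 0.6 cm of glass.
print(equivalent_distance(80.0, 240.0, 0.6))  # ~260.9 cm
```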

Fig. 2 Refraction at the interfaces between water, glass and air. Due to refraction of water, the object looks as if it were moved closer. Therefore the equivalent object distance zw is defined.

According to the Beer-Lambert law [31], the radiance emitted or reflected by the underwater object is attenuated exponentially with respect to the underwater distance. In addition, the scattering of light by both the water and the particles in the water should be considered in underwater imaging [13, 23]. In a similar way to Eq. (4), the irradiance in the image point Pw (denoted as Ew(xw,yw,λ,zw)) can be represented as

E_w(x'_w, y'_w, \lambda, z_w) = \beta(\lambda)\,e^{-\alpha(\lambda)l}\,C\,G(z_w)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x, y, \lambda) + E_b(\lambda)\,e^{-\nu(\lambda)l} + E_s(\lambda), \qquad (9)
where β(λ) represents the transmissivity of the optical window and α(λ) is the attenuation coefficient of the water. The term Eb(λ)e^{−ν(λ)l} represents the influence of the scattered light on the background of the image, with parameters Eb(λ) and ν(λ). As scattering changes the direction of light propagation, not only is the optical power at the image point P′w reduced (as embraced in the attenuation coefficient α(λ)), but the background of the image is also hazed, as in the term Eb(λ)e^{−ν(λ)l}. Es(λ) represents the influence of stray light, which adds to the hazing of the image as well.

By comparing Eq. (4) and Eq. (9), it can be seen that the irradiance in the underwater image is reduced by both the attenuation and the refraction of water. Through attenuation, the total light power arriving at the image plane is reduced, while, due to refraction, the underwater image is enlarged compared with the image in air at the same object distance. Therefore, the light power per unit area in the image plane is reduced, and underwater images appear darker than those taken in air.

When imaging with a digital camera, suppose the image point P′w falls within a pixel with coordinates (iw, jw); then, similar to Eqs. (5) and (6), the brightness of the pixel can be expressed as

I_w(i_w, j_w, \lambda_c, z_w) = a\,t \iint_{S_{i,j}} \int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} q(\lambda)\,E_w(x'_w, y'_w, \lambda, z_w)\,d\lambda\,dx'\,dy' = A\,C\,G(z_w) \int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} \beta(\lambda)\,e^{-\alpha(\lambda)l}\,q(\lambda)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x, y, \lambda)\,d\lambda + A \int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} E_b(\lambda)\,e^{-\nu(\lambda)l}\,d\lambda + A \int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} E_s(\lambda)\,d\lambda, \qquad (10)
where Iw(iw, jw, λc, zw) is the pixel brightness.

As narrowband filters are used for imaging, the coefficients α(λ), β(λ), Eb(λ), ν(λ) and Es(λ) are considered constant within the passband of the filter, i.e.,

\alpha(\lambda) \approx \alpha(\lambda_c), \quad \beta(\lambda) \approx \beta(\lambda_c), \quad E_b(\lambda) \approx E_b(\lambda_c), \quad \nu(\lambda) \approx \nu(\lambda_c), \quad E_s(\lambda) \approx E_s(\lambda_c), \qquad (11)
for λ ∈ [λc − Δλ1, λc + Δλ2]. Therefore, Eq. (10) can be simplified as
I_w(i_w, j_w, \lambda_c, z_w) \approx \beta(\lambda_c)\,e^{-\alpha(\lambda_c)l}\,A\,C\,G(z_w) \int_{\lambda_c - \Delta\lambda_1}^{\lambda_c + \Delta\lambda_2} q(\lambda)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x, y, \lambda)\,d\lambda + \underbrace{A\,E_b(\lambda_c)(\Delta\lambda_1 + \Delta\lambda_2)}_{\kappa(\lambda_c)}\,e^{-\nu(\lambda_c)l} + \underbrace{A\,E_s(\lambda_c)(\Delta\lambda_1 + \Delta\lambda_2)}_{\gamma(\lambda_c)} = \beta(\lambda_c)\,e^{-\alpha(\lambda_c)l}\,A\,C\,G(z_w)\,H(x, y, \lambda_c) + \kappa(\lambda_c)\,e^{-\nu(\lambda_c)l} + \gamma(\lambda_c), \qquad (12)
where the coefficients κ(λc) and γ(λc) represent the influence of the scattered light and the stray light, respectively. Since the coefficients α(λ), β(λ), Eb(λ), ν(λ) and Es(λ) do in fact vary with wavelength, Eq. (11) holds exactly only if the bandwidth of the filter is infinitely small (i.e., Δλ1 + Δλ2 approaches 0). The approximation error generally increases with the bandwidth of the filter, but it also depends on the variation of the coefficients. Therefore, it is important to keep the bandwidth of the filter narrow enough, especially at wavelengths where the coefficients vary rapidly.

3. Spectral image restoration

By combining Eq. (6) and Eq. (12), a connection can be made between the image brightness in air and underwater as

I_w(i_w, j_w, \lambda_c, z_w) = \frac{G(z_w)}{G(z)}\,\underbrace{\beta(\lambda_c)\,e^{-\alpha(\lambda_c)l}}_{k(\lambda_c,\,l)}\,I(i, j, \lambda_c, z) + \underbrace{\kappa(\lambda_c)\,e^{-\nu(\lambda_c)l} + \gamma(\lambda_c)}_{b(\lambda_c,\,l)}. \qquad (13)

The term k(λc, l) represents the attenuation by the water and the optical window, and b(λc, l) represents the hazing caused by scattering, stray light, etc. The goal of image restoration is to compensate for the energy loss in underwater images due to the attenuation of the water and the optical window.
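
The forward model of Eq. (13) is straightforward to sketch in code; the numbers below (focal length, k, b) are purely illustrative assumptions, not calibrated results:

```python
def G(z, f):
    """G(z) = ((z - f) / z)**2, as defined in Eq. (2)."""
    return ((z - f) / z) ** 2

def underwater_brightness(I_air, z, z_w, f, k, b):
    """Forward model of Eq. (13): I_w = (G(z_w)/G(z)) * k(lam_c, l) * I + b(lam_c, l)."""
    return (G(z_w, f) / G(z, f)) * k * I_air + b

# Hypothetical example: f = 15 cm, air distance 340 cm, equivalent distance 261 cm.
print(underwater_brightness(0.8, z=340.0, z_w=261.0, f=15.0, k=0.5, b=0.02))
```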

Define a coefficient vector ϕ(λ) = [α(λ), ν(λ), β(λ), κ(λ), γ(λ)] ∈ ℝ⁵. If all the coefficients in ϕ(λ) are known, the energy loss in underwater images can be rectified based on the relationship in Eq. (13). In practice, however, the coefficients vary across different regions of the oceans and may also change over time. Therefore, it is important to calibrate the coefficients in situ to restore the underwater spectral images accurately.

3.1. Calibration of coefficients

Given the relationship established in Eq. (13), the coefficients in ϕ(λ) can be estimated by data fitting, provided that an air image and underwater images of the same object are available. However, the brightness values I(i, j, λc, z) and Iw(iw, jw, λc, zw) must correspond to the same point P on the object, which is difficult, if not impossible, to achieve in practice. For this reason, a calibration object with an easily recognizable pattern is necessary to establish correspondence for the same point or region between the air and underwater images. For instance, a matrix of squares displayed on a phone screen (see Fig. 3) is used as the calibration object in the experiments later on. The color is uniform within each square for ease of establishing correspondence between images, but differs between squares to improve the accuracy of the coefficient estimation.

Fig. 3 The experimental setup consists of a mobile phone, a water tank and a narrowband spectral imaging system as shown in (a) and (b). The mobile phone is placed in a waterproof box, acting as a luminous underwater object. The pattern displayed on the screen of the phone consists of 60 color pieces in 6 rows and 10 columns. The filters in the spectral imaging system are tuned by the rotation filter wheel with the wavelength scanning from 420 nm to 700 nm at an interval of 20 nm. The typical transmission of a filter with a central wavelength of around 460 nm [33] is shown in (c).

Suppose we have spectral images of the calibration object in air and underwater. All images are acquired with the central wavelength of the filter λc = λ1. The image in air is captured at an object distance of z = z1. The images underwater are captured at a series of underwater distances l = l1, l2, …, lM (M is the number of distances). The coefficients in ϕ(λ) can be calibrated based on Eq. (13) as follows.

Take the brightness of N regions in the air image and of the corresponding regions in the underwater images (N is the number of regions used for calibration); a set of equations can then be built as

I_w(i_{w,1}, j_{w,1}, \lambda_1, z_{w,1}) = \frac{G(z_{w,1})}{G(z_1)}\,k(\lambda_1, l_1)\,I(i_1, j_1, \lambda_1, z_1) + b(\lambda_1, l_1),
I_w(i_{w,2}, j_{w,2}, \lambda_1, z_{w,1}) = \frac{G(z_{w,1})}{G(z_1)}\,k(\lambda_1, l_1)\,I(i_2, j_2, \lambda_1, z_1) + b(\lambda_1, l_1),
\quad\vdots
I_w(i_{w,N}, j_{w,N}, \lambda_1, z_{w,1}) = \frac{G(z_{w,1})}{G(z_1)}\,k(\lambda_1, l_1)\,I(i_N, j_N, \lambda_1, z_1) + b(\lambda_1, l_1). \qquad (14)

Writing Eq. (14) in matrix form, we have the compact matrix equation

\underbrace{\begin{bmatrix} \frac{G(z_{w,1})}{G(z_1)} I(i_1, j_1, \lambda_1, z_1) & 1 \\ \frac{G(z_{w,1})}{G(z_1)} I(i_2, j_2, \lambda_1, z_1) & 1 \\ \vdots & \vdots \\ \frac{G(z_{w,1})}{G(z_1)} I(i_N, j_N, \lambda_1, z_1) & 1 \end{bmatrix}}_{D} \underbrace{\begin{bmatrix} k(\lambda_1, l_1) \\ b(\lambda_1, l_1) \end{bmatrix}}_{X} = \underbrace{\begin{bmatrix} I_w(i_{w,1}, j_{w,1}, \lambda_1, z_{w,1}) \\ I_w(i_{w,2}, j_{w,2}, \lambda_1, z_{w,1}) \\ \vdots \\ I_w(i_{w,N}, j_{w,N}, \lambda_1, z_{w,1}) \end{bmatrix}}_{Y}. \qquad (15)

The unknown terms k(λ1, l1) and b(λ1, l1) can be estimated by solving Eq. (15) with the linear least squares (LLS) method as

\hat{X} = (D^{T} D)^{-1} D^{T} Y, \qquad (16)
where X̂ is the estimate of the unknown X. To make sure that the matrix DᵀD is invertible, at least two regions of different brightness should be selected, i.e., D must have full column rank. By including more regions in the equation set, the accuracy of the estimation can be improved.
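
A minimal sketch of the LLS step of Eqs. (15) and (16), written with numpy's least-squares solver rather than the explicit normal equations (numerically equivalent here, but better conditioned):

```python
import numpy as np

def estimate_k_b(I_air, I_water, G_ratio):
    """Estimate k(lam_1, l_1) and b(lam_1, l_1) by solving Eq. (15) in the LLS sense.

    I_air, I_water : brightness of the N calibration regions in air and underwater
    G_ratio        : scalar G(z_w)/G(z), computed from Eq. (2)
    """
    I_air = np.asarray(I_air, dtype=float)
    D = np.column_stack([G_ratio * I_air, np.ones_like(I_air)])  # design matrix of Eq. (15)
    X, *_ = np.linalg.lstsq(D, np.asarray(I_water, dtype=float), rcond=None)
    k_hat, b_hat = X
    return k_hat, b_hat
```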

To obtain an estimate of the water attenuation coefficient α(λ), underwater images captured at different distances can be used. According to the definition of k(λ, l) in Eq. (13), a set of equations can be developed for a series of underwater distances as

\beta(\lambda_1)\,\underbrace{\left[ e^{-\alpha(\lambda_1)l_1},\; e^{-\alpha(\lambda_1)l_2},\; \cdots,\; e^{-\alpha(\lambda_1)l_M} \right]^{T}}_{P(\alpha)} = \underbrace{\left[ k(\lambda_1, l_1),\; k(\lambda_1, l_2),\; \cdots,\; k(\lambda_1, l_M) \right]^{T}}_{K}. \qquad (17)

Unknowns α(λ1) and β(λ1) can be estimated by solving an optimization problem as

\hat{\alpha}(\lambda_1), \hat{\beta}(\lambda_1) = \arg\min_{\alpha^{*},\,\beta^{*}} \underbrace{\left\| K - \beta^{*} P(\alpha^{*}) \right\|_2^2}_{J}, \qquad (18)
where J is the cost function to be minimized by the optimization algorithm, and α̂(λ1) and β̂(λ1) are the estimates of α(λ1) and β(λ1), respectively.
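
A sketch of this exponential fit using scipy's nonlinear least squares; the initial guesses are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def estimate_alpha_beta(distances, k_values):
    """Fit k(lam, l) = beta * exp(-alpha * l) over distances l (Eqs. (17) and (18))."""
    model = lambda l, alpha, beta: beta * np.exp(-alpha * l)
    (alpha_hat, beta_hat), _ = curve_fit(model,
                                         np.asarray(distances, float),
                                         np.asarray(k_values, float),
                                         p0=[0.01, 0.9])  # assumed initial guess
    return alpha_hat, beta_hat
```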

Similarly, another set of equations is formed to estimate the hazing coefficients:

\kappa(\lambda_1)\,\underbrace{\left[ e^{-\nu(\lambda_1)l_1},\; e^{-\nu(\lambda_1)l_2},\; \cdots,\; e^{-\nu(\lambda_1)l_M} \right]^{T}}_{Q(\nu)} + \gamma(\lambda_1) = \underbrace{\left[ b(\lambda_1, l_1),\; b(\lambda_1, l_2),\; \cdots,\; b(\lambda_1, l_M) \right]^{T}}_{B}. \qquad (19)

Coefficients ν(λ1), κ(λ1) and γ(λ1) are estimated by solving an optimization problem as

\hat{\nu}(\lambda_1), \hat{\kappa}(\lambda_1), \hat{\gamma}(\lambda_1) = \arg\min_{\nu^{*},\,\kappa^{*},\,\gamma^{*}} \underbrace{\left\| B - \gamma^{*} - \kappa^{*} Q(\nu^{*}) \right\|_2^2}_{J'}, \qquad (20)
where J′ is the cost function to be minimized by the optimization algorithm, and ν̂(λ1), κ̂(λ1) and γ̂(λ1) are the estimates of ν(λ1), κ(λ1) and γ(λ1), respectively.
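
The corresponding sketch for the hazing coefficients, again with assumed initial guesses:

```python
import numpy as np
from scipy.optimize import curve_fit

def estimate_haze(distances, b_values):
    """Fit b(lam, l) = kappa * exp(-nu * l) + gamma over distances l (Eqs. (19) and (20))."""
    model = lambda l, nu, kappa, gamma: kappa * np.exp(-nu * l) + gamma
    (nu_hat, kappa_hat, gamma_hat), _ = curve_fit(model,
                                                  np.asarray(distances, float),
                                                  np.asarray(b_values, float),
                                                  p0=[0.01, 0.05, 0.0])  # assumed guesses
    return nu_hat, kappa_hat, gamma_hat
```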

By repeating the procedures above for the different wavelengths λc = λ1, λ2, ⋯, λp (where p is the number of wavelengths to be tuned in the imaging system), the coefficients in ϕ(λ) can be estimated.

3.2. Image restoration

With coefficients estimated, following Eq. (13), the underwater images can be restored as

\tilde{I}_w(i_w, j_w, \lambda_c, z_w) = \hat{\beta}^{-1}(\lambda_c)\,e^{\hat{\alpha}(\lambda_c)l}\left( I_w(i_w, j_w, \lambda_c, z_w) - \hat{\kappa}(\lambda_c)\,e^{-\hat{\nu}(\lambda_c)l} - \hat{\gamma}(\lambda_c) \right), \qquad (21)
where Ĩw(iw, jw, λc, zw) is the image brightness after restoration. The restoration compensates for the energy loss due to water attenuation, but the size of the image is unchanged.
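
A minimal sketch of Eq. (21); Iw may be a scalar or a 2-D array of normalized pixel brightness:

```python
import numpy as np

def restore(I_w, l, alpha, beta, nu, kappa, gamma):
    """Restore underwater image brightness per Eq. (21) at one central wavelength.

    l is the underwater distance; the remaining arguments are the calibrated
    coefficients at the filter's central wavelength.
    """
    return (np.exp(alpha * l) / beta) * (I_w - kappa * np.exp(-nu * l) - gamma)
```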

To evaluate the accuracy of the restoration method, comparison is made between the restored underwater images and images captured in air. The restoration error is defined as

I_e(i, j, \lambda_c, z) = \Bigl|\, I(i, j, \lambda_c, z) - \underbrace{\frac{G(z)}{G(z_w)}\,\tilde{I}_w(i_w, j_w, \lambda_c, z_w)}_{\hat{I}(i, j, \lambda_c, z)} \,\Bigr|, \qquad (22)
where Ie(i, j, λc, z) is the restoration error. The term G(z)/G(zw) accounts for the change in brightness of the underwater image due to the change in image size, and Î(i, j, λc, z) is the estimated image brightness in air. The relative restoration error is defined as
\varepsilon(i, j, \lambda_c, z) = \frac{I_e(i, j, \lambda_c, z)}{I(i, j, \lambda_c, z)} \times 100\%. \qquad (23)

Therefore the restoration method can be summarized as follows.

Model-based narrowband underwater spectral image restoration method (general description and pseudocode implementation; a runnable sketch follows the listing)

  1. Spectral image acquisition

    Capture a spectral image cube for a calibration object in air at a distance of z, with wavelength λ = λ1, λ2,⋯,λp (where p is the number of filters).

    Capture M underwater spectral image cubes for the calibration object at underwater distances l = l1,l2,⋯,lM (M is the number of underwater distances), respectively. Each cube is captured at wavelength λ = λ1, λ2,⋯,λp.

  2. Coefficient calibration

     for λc = λ1, λ2,⋯, λp

      for l = l1, l2,⋯, lM

       Estimate k(λ,l) and b(λ,l) using Eqs. (15) and (16).

      end

      Estimate the coefficients α(λ), β(λ), ν(λ), κ(λ) and γ(λ) using Eqs. (17)–(20).

     end

  3. Image restoration

Image brightness restoration using Eq. (21).

Error evaluation using Eq. (22) and Eq. (23).
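
The calibration loop of step 2 can be sketched compactly in code, reusing the helper functions sketched in Section 3.1 (estimate_k_b, estimate_alpha_beta, estimate_haze); the dictionary layout of the inputs is an assumption for illustration:

```python
def calibrate(air_regions, water_regions, wavelengths, distances, G_ratio):
    """Return {lam_c: (alpha, beta, nu, kappa, gamma)} for all central wavelengths.

    air_regions[lam_c]        : brightness of the N regions in the air image
    water_regions[(lam_c, l)] : brightness of the same regions underwater
    G_ratio[(lam_c, l)]       : G(z_w)/G(z) for each wavelength and distance
    """
    coeffs = {}
    for lam_c in wavelengths:
        k_list, b_list = [], []
        for l in distances:
            # Step 2a: per-distance LLS fit of k and b via Eqs. (15) and (16).
            k, b = estimate_k_b(air_regions[lam_c],
                                water_regions[(lam_c, l)],
                                G_ratio[(lam_c, l)])
            k_list.append(k)
            b_list.append(b)
        # Step 2b: exponential fits over distance via Eqs. (17)-(20).
        alpha, beta = estimate_alpha_beta(distances, k_list)
        nu, kappa, gamma = estimate_haze(distances, b_list)
        coeffs[lam_c] = (alpha, beta, nu, kappa, gamma)
    return coeffs
```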

4. Experimental setup

To verify the restoration method, an experimental setup is implemented as shown in Fig. 3, which mainly consists of a mobile phone, a water tank and a spectral imaging system.

The mobile phone (Xiaomi 3, Xiaomi, China) is placed in a waterproof glass box, with 60 color pieces displayed on its LCD screen, to act as the underwater object to be imaged. The color pieces are squares in 6 rows and 10 columns. Each column presents a hue of the Munsell color system [32], and each row presents a different combination of value and chroma. As discussed in Section 3.1, the screen pattern is designed for the calibration of the model coefficients, such that the same color piece can be easily recognized among images captured in air and underwater.

The waterproof box (L35 × W25 × H50 cm) is made of 6 mm-thick quartz glass, placed in a water tank (L300 × W30 × H30 cm) made of 10 mm-thick quartz glass. The water tank is filled with clean tap water.

The spectral imaging system is placed outside the water tank. It mainly consists of an imaging lens, a mirror, a rotation filter wheel with a set of filters, and a monochrome CCD camera. The lens is a cemented doublet (GCL-010607, Daheng, China), with a focal length of 150 mm and a diameter of 38.1 mm. The computer-controlled rotation filter wheel is installed with 15 filters, each with a small magnet beside it. The wheel is driven by a stepper motor, with the rotation angle monitored by a Hall sensor. The filters (FB series, Thorlabs, USA) on the wheel have a FWHM of 10 nm (see Fig. 3(c)), with central wavelengths ranging from 420 nm to 700 nm at an interval of 20 nm. The camera (Lm-165M, Lumenera, USA) has a resolution of 1392 × 1040 pixels and a dynamic range of 66 dB. During image acquisition, the filters are rotated onto the optical axis of the camera successively, and each stays on the axis until the exposure is complete.

5. Experiments and results

5.1. Preliminary tests

Preliminary tests are carried out before spectral image acquisition to evaluate the stability of the phone screen and the linearity of the camera. The mobile phone is switched on for more than 10 minutes before the test. The output light intensity of a given color piece is measured by a fiber spectrometer (FLA5000, Jingfei, China) for 3500 scans at an integration time of 50 ms. During the test, the phone is kept charged with a voltage-stabilized source.

The response of the spectrometer for the color piece (3,7) (i.e., the piece in the 3rd row and 7th column of the pattern) is shown in Fig. 4, where the responses at wavelengths of 447 nm, 540 nm, and 660 nm are evaluated. The intensity variation at a given wavelength λ is defined as

\text{Intensity variation} = \frac{I_{\max}(\lambda) - I_{\min}(\lambda)}{I_{\mathrm{mean}}(\lambda)}, \qquad (24)
where Imax(λ), Imin(λ) and Imean(λ) are the maximal, minimal, and average intensities at the given wavelength during the observation, respectively. The intensity variation is 1%, 1.7% and 8.9% for wavelengths of 447 nm, 540 nm, and 660 nm, respectively, indicating a stable light output by the phone screen.
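
A one-line sketch of the metric in Eq. (24):

```python
import numpy as np

def intensity_variation(intensity):
    """(I_max - I_min) / I_mean over an observation series, as in Eq. (24)."""
    intensity = np.asarray(intensity, dtype=float)
    return (intensity.max() - intensity.min()) / intensity.mean()
```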

Fig. 4 Response of the spectrometer on the color piece (3,7) on the phone screen. The intensity at wavelengths of 447 nm, 540 nm, and 660 nm is recorded for 3500 scans (i.e., 175 s), showing a stable light output by the phone screen.

In the test of the camera linearity, an integrating sphere with a standard light source inside is used as the imaging object. The camera takes images of the exit port of the integrating sphere, with the exposure time ranging from 5 ms to 500 ms at an interval of 5 ms. From these images, the dependence of the image brightness on the exposure time is analyzed. A linear correlation coefficient of 0.99996 indicates good linearity between the image brightness and the exposure time of the camera.

5.2. Spectral image acquisition

To avoid overexposed or dark images, a scan of the spectral images of the phone screen is conducted before image acquisition, during which the exposure time ranges from 50 ms to 500 ms, and an appropriate exposure time is selected for each wavelength. Then one spectral image cube is acquired in air at a fixed distance of 340 cm, and 26 spectral image cubes are acquired at underwater distances of 10 cm, 20 cm, ⋯, 260 cm, respectively. Each image cube (both in air and underwater) consists of 15 images captured at wavelengths of 420 nm, 440 nm, ⋯, 700 nm, respectively. The image cubes captured at underwater distances of 10 cm, 20 cm, 40 cm, 50 cm, 70 cm, ⋯, 250 cm, 260 cm (i.e., two of every three distances) are used for coefficient calibration, and the rest are used for testing.

After image collection, the image brightness is normalized to an exposure time of 50 ms. The time-normalized image brightness In is calculated as

I_n = \frac{50}{t} \cdot \frac{I_0}{255}, \qquad (25)
where I0 is the brightness of the raw image before normalization and t is the exposure time in ms. The raw brightness I0 is a unitless integer in the range of [0, 255]. The exposure time t is always more than 50 ms during image acquisition, hence the normalized brightness In is in the range of [0, 1].
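
A direct transcription of Eq. (25):

```python
def normalize_brightness(I0, t_ms):
    """Time-normalized brightness I_n = (50 / t) * (I_0 / 255), as in Eq. (25)."""
    return (50.0 / t_ms) * (I0 / 255.0)
```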

The irradiance (in W·m⁻²·nm⁻¹) at the image sensor could be determined if the imaging system were calibrated with a standard calibration facility and method, e.g., as described in [27]. However, since the focus of this paper is the restoration method (i.e., restoring the underwater image such that its brightness is close to that in air), the normalized brightness In is used throughout the image processing and analysis in the following steps.

5.3. Coefficient calibration

As described in Section 3.1, brightness values in the air image and the underwater images corresponding to the same point or region on the calibration object are required for coefficient calibration. For this purpose, the central 5 × 5 pixels of each color piece image are selected, and the brightness in this region is averaged to represent the brightness of that color piece in that image. That is, there are 60 brightness values for each image, each representing the brightness of an individual color piece.

The coefficients k(λ,l) and b(λ,l) are estimated based on Eq. (15) and Eq. (16). The fitting results are shown in Fig. 5 for wavelengths of 460 nm and 620 nm at underwater distances of 10 cm, 40 cm and 240 cm. To evaluate the accuracy of the estimation, the relative standard deviation (RSD) is defined as

\mathrm{RSD}(\hat{y}, y) = \frac{\mathrm{std}(\hat{y} - y)}{\mathrm{std}(y)} \times 100\%. \qquad (26)

Fig. 5 Linear fitting between the brightness of underwater images and the brightness of air images, for estimation of k(λ,l) and b(λ,l), at wavelengths of 460 nm and 620 nm and underwater distances of 60 cm, 160 cm and 260 cm.

Here std(y) is the standard deviation of y. For the fitting of k(λ,l) and b(λ,l), the RSD is 5.5%, 3.2% and 6.0% for underwater distances of 20 cm, 160 cm and 260 cm, respectively, at a wavelength of 460 nm, showing accurate estimation of k(λ,l) and b(λ,l).
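
A sketch of the RSD metric of Eq. (26):

```python
import numpy as np

def rsd(y_hat, y):
    """Relative standard deviation RSD = std(y_hat - y) / std(y) * 100%, as in Eq. (26)."""
    y_hat = np.asarray(y_hat, dtype=float)
    y = np.asarray(y, dtype=float)
    return np.std(y_hat - y) / np.std(y) * 100.0
```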

With k(λ,l) available, the unknown coefficients α(λ) and β(λ) are estimated based on Eqs. (17) and (18). The fitting results are shown in Fig. 6 for wavelengths of 460 nm and 620 nm. The maximal RSD is 8.66% for the entire calibration data set and 9.56% for the test set, which is very close to that of the calibration set.

Fig. 6 Exponential fitting between k(λ,l) and underwater distance l, for the estimation of coefficients α(λ) and β(λ). The RSD of the calibration data set is 3.8% and 3.2% for 460 nm and 620 nm, respectively. For the test distances, RSD is 6.4% for 460 nm and 3.6% for 620 nm.

Likewise, the hazing coefficients ν(λ), κ(λ) and γ(λ) are estimated from b(λ,l) based on Eqs. (19) and (20). The fitting results are shown in Fig. 7, again for wavelengths of 460 nm and 620 nm. From the plots, it can be seen that the fitting curve is very close to the data points, indicating accurate fitting of the coefficients.

Fig. 7 Exponential fitting between b(λ,l) and underwater distance l, for the estimation of coefficients ν(λ), κ(λ) and γ(λ). The RSD of the calibration data set is 6.3% and 2.0% for 460 nm and 620 nm, respectively. For the test distances, RSD is 9.5% for 460 nm and 2.9% for 620 nm.

5.4. Image restoration

With all the necessary coefficients (i.e., α(λ), β(λ), ν(λ), κ(λ) and γ(λ)) estimated, image restoration is performed according to Eq. (21). In Fig. 8(a), the raw underwater images and the images after restoration are shown for comparison, for wavelengths of 460 nm and 620 nm at underwater distances of 60 cm, 150 cm and 240 cm (i.e., all in the test image set). As the underwater distance increases, the brightness of the underwater images decreases due to water attenuation, and the size of the image shrinks. It is also clearly visible that the brightness of the images is improved significantly after restoration.

Fig. 8 Comparison between raw underwater images, restored underwater images and images in air. Brightness and size of underwater images decrease with underwater distance. By restoration, the brightness of underwater images is improved significantly.

To quantify the accuracy of the image restoration, the restoration error Ie and the relative restoration error ε are evaluated as in Eq. (22) and Eq. (23), respectively. For the wavelength of 460 nm, the relative restoration error is 4.90%, 5.73% and 5.66% for distances of 60 cm, 150 cm and 240 cm, respectively. For the wavelength of 620 nm, the relative error is 4.45% for an underwater distance of 240 cm, which indicates that the energy loss at 620 nm has been well compensated.

Figure 9 shows the spectra of four color pieces before and after restoration. The raw spectra are collected from underwater images at an underwater distance of 240 cm. The results are also compared with the spectra in air. It can be seen that the spectra after restoration almost overlap those in air. The restoration errors of the four pieces are all less than 0.02. The relative restoration error is 3.58% on average over all 60 color pieces, all distances and all wavelengths, indicating accurate restoration of the spectral energy.

Fig. 9 Comparison among the raw underwater spectrum, the restored underwater spectrum and the spectrum in air, for color pieces (1,5), (2,2), (3,9) and (4,7). The underwater distance is 240 cm. The restored spectra almost overlap the spectra in air, indicating an accurate compensation of the spectral energy loss due to water.

To show the restoration results intuitively, color images are presented in Fig. 10, where the color images are constructed from the spectral images as

I_c(x, y) = \frac{\sum_{\lambda=420}^{700} I(x, y, \lambda)\,S_c(\lambda) \,/\, \left[ T(\lambda)\,S_m(\lambda) \right]}{\max\left\{ \sum_{\lambda=420}^{700} I(x, y, \lambda)\,S_c(\lambda) \,/\, \left[ T(\lambda)\,S_m(\lambda) \right] \right\}}, \qquad c \in \{r, g, b\}, \qquad (27)
where Sc(λ) is the quantum efficiency of a color camera (MER-030-120UC-L, Daheng, China), T(λ) is the peak transmittance of the filters, Sm(λ) is the quantum efficiency of the monochrome camera used for acquiring the spectral images (Lm-165M, Lumenera), and Ic(x,y) represents the brightness of each primary color. It can be seen in Fig. 10 that the restored color image is quite close to the one in air. The effect of the restoration is clearly visible.
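
A sketch of the construction in Eq. (27), assuming the weighting Sc(λ)/[T(λ)Sm(λ)] and a spectral cube stacked band-first (both are assumptions for illustration):

```python
import numpy as np

def construct_color_channel(cube, S_c, T, S_m):
    """Build one primary-color channel from a spectral cube per Eq. (27).

    cube : array of shape (n_bands, H, W), one image per central wavelength
    S_c  : color-camera quantum efficiency per band (length n_bands)
    T    : peak transmittance of each filter (length n_bands)
    S_m  : monochrome-camera quantum efficiency per band (length n_bands)
    """
    w = np.asarray(S_c, float) / (np.asarray(T, float) * np.asarray(S_m, float))
    channel = (cube * w[:, None, None]).sum(axis=0)  # weighted sum over bands
    return channel / channel.max()                   # normalize to [0, 1]
```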

Fig. 10 Color images constructed from: (a) raw underwater spectral images captured at an underwater distance of 240 cm; (b) restored underwater spectral images with the image size unchanged; (c) restored underwater spectral images with the image size changed to match the air image and the image brightness scaled accordingly; (d) spectral images captured in air at an object distance of 340 cm. The improvement by restoration is clearly visible in the color images.

6. Discussion

6.1. Influence of the number of filters on restoration

Because calibration and restoration are performed independently for each spectral band in the proposed method, increasing the number of filters without changing the bandwidth of the narrowband filters has no effect on the accuracy of the image restoration in any individual band. In this paper, 15 narrowband filters with central wavelengths between 420 nm and 700 nm are selected to test and verify the effectiveness of the proposed method over the visible range and to show the generality of the method.

In general, however, if more filters are used, the spectral radiance can be measured at more wavelengths, which is helpful, for instance, for high-fidelity color restoration, where spectral information is required over the whole visible range. In other cases, if only the spectral radiance at certain wavelengths is of interest, the central wavelengths of the filters can be selected to cover only these wavelengths, thus reducing the complexity of the imaging system.

6.2. Influence of the filter bandwidth on restoration

In the modeling of the underwater spectral imaging system, the derivation from Eq. (10) to Eq. (12) rests on the condition that the bandwidth of the filters is so narrow that the coefficients can be considered constant within the passband. The modeling error will increase, and the accuracy of the underwater image restoration will be degraded, if broadband filters (e.g., the red, green and blue color filters in standard color cameras) are used or if the attenuation coefficient of water varies rapidly with wavelength. On the other hand, a narrower bandwidth leads to better modeling accuracy but also decreases the light intensity in each band, thus reducing the signal-to-noise ratio in the images. Therefore, it is important to seek a balance between the bandwidth of the filter and the light intensity in each band.

6.3. Dependence of the restoration error on the distance and wavelength

As shown in Fig. 11, the relative restoration error ε varies with the underwater distance and the wavelength. As the underwater distance increases, the energy loss due to water attenuation becomes more severe. The underwater images then suffer more from stray light in the environment and noise in the imaging system, which leads to an increase in the restoration error.

Fig. 11 Dependence of the relative restoration error ε on the underwater distance (a) and wavelength (b). The restoration error is averaged over wavelength for each distance in (a) and averaged over distance for each wavelength in (b). The spectrum of the mobile phone screen emitting white light is measured by the fiber spectrometer and depicted in (b), showing an abrupt change in intensity in the wavelength range of [420 nm, 480 nm].

The restoration error also varies with the central wavelength of the narrowband filter (see Fig. 11(b)). The error decreases with wavelength in the range of [420 nm, 540 nm] and then increases in the range of [540 nm, 680 nm]. This may be because the spectral radiance emitted by the mobile phone screen changes significantly within the wavelength range of [420 nm, 480 nm], such that the approximation in Eq. (11) (i.e., that the coefficients are constant within the passband of the filter) results in a larger error in this wavelength range. As the wavelength increases beyond 540 nm, the attenuation coefficient of water increases as well. Therefore, the signal level in the underwater images is reduced and the restoration error increases.

6.4. Sensitivity to realistic environmental conditions

Compared with laboratory conditions, the optical properties of the water change with time and location under realistic environmental conditions. To investigate how the restoration error changes with the error in the calibrated coefficients, a simulation is conducted with random deviations added to the calibrated coefficients α and ν. The maximal amplitude of the deviation changes from 1% to 15% at an interval of 1%, with 500 runs for each amplitude. The relative restoration error of the test images is evaluated for each run, with the maximum and mean shown in Fig. 12(a) for a central wavelength of 460 nm. The mean restoration error increases from below 6% to about 8% as the deviation increases to 15%, while the maximal restoration error reaches 14% when the deviation is 10%, growing much faster than the mean error. Therefore, monitoring the optical properties of the water is important, and re-calibration is necessary if there is a significant change in the water properties.

Fig. 12 Sensitivity of the restoration error for images at 460 nm to variation in the optical properties of water (a) and to the error in underwater distance measurement (b).

Measurement error (or noise) in the underwater distance also influences the coefficient calibration and the image restoration. A simulation is conducted to investigate how the restoration error changes with the error in the underwater distance measurement. Random noise with a maximal amplitude in the range of 1%–15% is introduced into the distance measurements for both coefficient calibration and image restoration. The restoration error is evaluated, with 500 runs for each amplitude. The maximal and mean relative restoration errors are shown in Fig. 12(b). It can be seen that both the mean and the maximal restoration error increase with the measurement error. The maximal restoration error reaches 9% when the distance measurement error is about 10%.

7. Conclusion and future work

In this paper, a model is built for narrowband underwater spectral imaging, considering the optical properties of water, the narrowband filters, the lens and the camera. Calibration and restoration methods are then proposed based on the model. An experimental setup is built in the lab to verify the proposed method. The restored images are compared with images captured in air, and the results show that the relative restoration error is 3.58% on average for the test image group, thus verifying the accuracy of the restoration method.

Future work will focus on spectral reflectance restoration for underwater objects and field test of the method.

Acknowledgments

The authors would like to thank Qiang Yongfa, Na Di, Li Ruiqi and Chen Yao for their support in the experiments. The work is financially supported by the National High-tech R&D Program of China (863 Program) (No. 2014AA093400), the National Natural Science Foundation of China (No. 11304278) and the Open Fund of the State Key Laboratory of Satellite Ocean Environment Dynamics (No. SOED1606).

References and links

1. C. Liu, W. Liu, X. Lu, W. Chen, and J. Yang, "Potential of multispectral imaging for real-time determination of colour change and moisture distribution in carrot slices during hot air dehydration," Food Chem. 195, 110–116 (2016).

2. P. Launeau, A. R. Cruden, and J. L. Bouchez, "Mineral recognition in digital images of rocks: a new approach using multichannel classification," Can. Mineral. 32, 919 (1994).

3. R. Ismail, O. Mutanga, and U. Bob, "Forest health and vitality: the detection and monitoring of Pinus patula trees infected by Sirex noctilio using digital multispectral imagery," Southern Hemisphere Forestry Journal 69, 39–47 (2007).

4. Y. H. Ahn and P. Shanmugam, "Detecting the red tide algal blooms from satellite ocean color observations in optically complex Northeast-Asia Coastal waters," Remote Sens. Environ. 103, 419–437 (2006).

5. A. C. R. Gleason, R. P. Reid, and K. J. Voss, "Automated classification of underwater multispectral imagery for coral reef monitoring," in Proceedings of IEEE/MTS OCEANS'07 (IEEE, 2007), pp. 1–8.

6. M. Mehrubeoglu, M. Y. Teng, and P. V. Zimba, "Resolving mixed algal species in hyperspectral images," Sensors 14, 1–21 (2013).

7. M. Mehrubeoglu, D. K. Smith, S. W. Smith, K. B. Strychar, and L. McLauchlan, "Investigating coral hyperspectral properties across coral species and coral state using hyperspectral imaging," Proc. SPIE 8870, 88700M (2013).

8. D. Zawada, "Image processing of underwater multispectral imagery," IEEE J. Oceanic Eng. 28, 583–594 (2003).

9. P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, "The cover of living and dead corals from airborne remote sensing," Coral Reefs 23, 171–183 (2004).

10. S. Aarrestad, "Use of underwater hyperspectral imagery for geological characterization of the seabed," Master's thesis (Norwegian University of Science and Technology, 2014).

11. P. J. Mumby, C. D. Clark, E. P. Green, and A. J. Edwards, "Benefits of water column correction and contextual editing for mapping coral reefs," Int. J. Remote Sens. 19, 203–210 (1998).

12. H. Holden and E. LeDrew, "Effects of the water column on hyperspectral reflectance of submerged coral reef features," Bull. Mar. Sci. 69, 685–699 (2001).

13. H. Y. Yang, P. Y. Chen, C. C. Huang, Y. Z. Zhuang, and Y. H. Shiau, "Low complexity underwater image enhancement based on dark channel prior," in Proceedings of the International Conference on Innovations in Bio-inspired Computing and Applications (IEEE, 2011), pp. 17–20.

14. R. Pettersen, G. Johnsen, P. Bruheim, and T. Andreassen, "Development of hyperspectral imaging as a bio-optical taxonomic tool for pigmented marine organisms," Org. Divers. Evol. 14, 237–246 (2014).

15. G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Søreide, P. Fearns, M. Ludvigsen, and M. Moline, "Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties," in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.

16. C. D. Mobley, L. K. Sundman, C. O. Davis, J. H. Bowles, T. V. Downes, R. A. Leathers, M. J. Montes, W. P. Bissett, D. D. R. Kohler, R. P. Reid, E. M. Louchard, and A. C. R. Gleason, "Interpretation of hyperspectral remote-sensing imagery by spectrum matching and look-up tables," Appl. Opt. 44, 3576–3592 (2005).

17. J. Åhlén, E. Bengtsson, and D. Sundgren, "Evaluation of underwater spectral data for colour correction applications," in Proceedings of the 5th WSEAS International Conference on Circuits, Systems, Electronics, Control & Signal Processing (World Scientific and Engineering Academy and Society, 2006), pp. 321–326.

18. J. Åhlén, "Colour correction of underwater images using spectral data," Ph.D. thesis (Acta Universitatis Upsaliensis, 2005).

19. D. L. Bongiorno, M. Bryson, and S. B. Williams, "Dynamic spectral-based underwater colour correction," in Proceedings of IEEE/MTS OCEANS'13 (IEEE, 2013), pp. 1–9.

20. J. Kaeli, H. Singh, C. Murphy, and C. Kunz, "Improving color correction for underwater image surveys," in Proceedings of IEEE/MTS OCEANS'11 (IEEE, 2011), pp. 805–810.

21. M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, "Colour-consistent structure-from-motion models using underwater imagery," in Robotics: Science and Systems VIII, N. Roy, P. Newman, and S. Srinivasa, eds. (MIT, 2013).

22. A. Yamashita, M. Fujii, and T. Kaneko, "Color registration of underwater images for underwater sensing with consideration of light attenuation," in Proceedings of the 2007 IEEE International Conference on Robotics and Automation (IEEE, 2007), pp. 4570–4575.

23. M. Boffety, F. Galland, and A. Allais, "Color image simulation for underwater optics," Appl. Opt. 51, 5633–5642 (2012).

24. D. R. Mishra, S. Narumalani, D. Rundquist, and M. Lawson, "Characterizing the vertical diffuse attenuation coefficient for downwelling irradiance in coastal waters: implications for water penetration by high resolution satellite data," ISPRS J. Photogramm. Remote Sens. 60, 48–64 (2005).

25. M. A. Kara, M. Ennahachi, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, "Detection and classification of the mucosal and vascular patterns (mucosal morphology) in Barrett's esophagus by using narrow band imaging," Gastrointest. Endosc. 64, 155–166 (2006).

26. M. A. Kara, A. Mobammed, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, "Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett's esophagus," Gastrointest. Endosc. 64, 176–185 (2006).

27. S. W. Brown, G. P. Eppeldauer, and K. R. Lykke, "NIST facility for spectral irradiance and radiance responsivity calibrations with uniform sources," Metrologia 37, 579–583 (2000).

28. A. Ibrahim, S. Tominaga, and T. Horiuchi, "Invariant representation for spectral reflectance images and its application," EURASIP J. Image Video Process. 2011, 1–12 (2011).

29. X. Li and Z. Ceng, Geometrical Optics, Aberrations and Optical Design, 2nd ed. (Zhejiang University, 2007).

30. S. Lin and L. Zhang, "Determining the radiometric response function from a single grayscale image," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 2005), pp. 66–73.

31. D. F. Swinehart, "The Beer-Lambert law," J. Chem. Educ. 39, 333 (1962).

32. A. H. Munsell, "Munsell color system," http://munsell.com/

33. Thorlabs, "Optical filters," http://www.thorlabs.com/


R. Ismail, O. Mutanga, and U. Bob, “Forest health and vitality: the detection and monitoring of pinus patula trees infected by sirex noctilio using digital multispectral imagery,” Southern Hemisphere Forestry Journal 69, 39–47 (2007).
[Crossref]

Boffety, M.

Bongiorno, D. L.

D. L. Bongiorno, M. Bryson, and S. B. Williams, “Dynamic spectral-based underwater colour correction,” in Proceedings of IEEE/MTS OCEANS’13 (IEEE2013), pp. 1–9.

Bouchez, J. L.

P. Launeau, A. R. Cruden, and J. L. Bouchez, “Mineral recognition in digital images of rocks: a new approach using multichannel classification,” Can. Mineral. 32, 919 (1994).

Bowles, J. H.

Brown, S. W.

S. W. Brown, G. P. Eppeldauer, P. George, and R. Keith, “NIST facility for spectral irradiance and radiance responsivity calibrations with uniform sources,” Metrologia 37, 579–583 (2000).
[Crossref]

Bruheim, P.

R. Pettersen, G. Johnsen, P. Bruheim, and T. Andreassen, “Development of hyperspectral imaging as a bio-optical taxonomic tool for pigmented marine organisms,” Organisms Diversity & Evolution 14, 237–246 (2014).
[Crossref]

Bryson, M.

D. L. Bongiorno, M. Bryson, and S. B. Williams, “Dynamic spectral-based underwater colour correction,” in Proceedings of IEEE/MTS OCEANS’13 (IEEE2013), pp. 1–9.

M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, “Colour-consistent structure-from-motion models using underwater imagery,” in Robotics: Science and Systems VIII, N. Roy, P. Newman, and S. Srinivasa, eds. (MIT, 2013).

Ceng, Z.

X. Li and Z. Ceng, Geometrical Optics, Aberrations and Optical Design, 2nd ed. (Zhejiang University, 2007).

Chen, P. Y.

H. Y. Yang, P. Y. Chen, C. C. Huang, Y. Z. Zhuang, and Y. H. Shiau, “Low complexity underwater image enhancement based on dark channel prior,” in Proceedings of International Conference on Innovations in Bio-inspired Computing and Applications, (IEEE, 2011), pp. 17–20.

Chen, W

C. Liu, W. Liu W, X Lu, W Chen, and J Yang, “Potential of multispectral imaging for real-time determination of colour change and moisture distribution in carrot slices during hot air dehydration,” Food Chem. 195, 110–116 (2016).
[Crossref]

Chisholm, J.

P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, “The cover of living and dead corals from airborne remote sensing,” Coral Reefs 23, 171–183 (2004).
[Crossref]

Clark, C.

P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, “The cover of living and dead corals from airborne remote sensing,” Coral Reefs 23, 171–183 (2004).
[Crossref]

Clark, C. D.

P. J. Mumby, C. D. Clark, E. P. Green, and A. J. Edwards, “Benefits of water column correction and contextual editing for mapping coral reefs,” International Journal of Remote Sensing 19, 203–210 (1998).
[Crossref]

Cruden, A. R.

P. Launeau, A. R. Cruden, and J. L. Bouchez, “Mineral recognition in digital images of rocks: a new approach using multichannel classification,” Can. Mineral. 32, 919 (1994).

Davis, C. O.

Dierssen, H.

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

Downes, T. V.

Edwards, A. J.

P. J. Mumby, C. D. Clark, E. P. Green, and A. J. Edwards, “Benefits of water column correction and contextual editing for mapping coral reefs,” International Journal of Remote Sensing 19, 203–210 (1998).
[Crossref]

Ennahachi, M.

M. A. Kara, M. Ennahachi, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Detection and classification of the mucosal and vascular patterns (mucosal morphology) in Barrett’s esophagus by using narrow band imaging,” Gastrointest Endosc. 64, 155–166 (2006).
[Crossref] [PubMed]

Eppeldauer, G. P.

S. W. Brown, G. P. Eppeldauer, P. George, and R. Keith, “NIST facility for spectral irradiance and radiance responsivity calibrations with uniform sources,” Metrologia 37, 579–583 (2000).
[Crossref]

Fearns, P.

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

Fockens, P.

M. A. Kara, A. Mobammed, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett’s esophagus,” Gastrointest Endosc. 64, 176–185 (2006).
[Crossref] [PubMed]

M. A. Kara, M. Ennahachi, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Detection and classification of the mucosal and vascular patterns (mucosal morphology) in Barrett’s esophagus by using narrow band imaging,” Gastrointest Endosc. 64, 155–166 (2006).
[Crossref] [PubMed]

Fujii, M.

A. Yamashita, M. Fujii, and T. Kaneko, “Color registration of underwater images for underwater sensing with consideration of light attenuation,” in Proceedings of 2007 IEEE International Conference on Robotics and Automation (IEEE, 2007), pp. 4570–4575.

Galland, F.

George, P.

S. W. Brown, G. P. Eppeldauer, P. George, and R. Keith, “NIST facility for spectral irradiance and radiance responsivity calibrations with uniform sources,” Metrologia 37, 579–583 (2000).
[Crossref]

Gleason, A. C. R.

Green, E. P.

P. J. Mumby, C. D. Clark, E. P. Green, and A. J. Edwards, “Benefits of water column correction and contextual editing for mapping coral reefs,” International Journal of Remote Sensing 19, 203–210 (1998).
[Crossref]

Hedley, J. D.

P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, “The cover of living and dead corals from airborne remote sensing,” Coral Reefs 23, 171–183 (2004).
[Crossref]

Holden, H.

H. Holden and E. LeDrew, “Effects of the water column on hyperspectral reflectance of submerged coral reef features,” Bulletin of Marine Science 69, 685–699 (2001).

Horiuchi, T.

A. Ibrahim, S. Tominaga, and T. Horiuchi, “Invariant representation for spectral reflectance images and its application,” EURASIP J. Image Vide. 2011, 1–12 (2011).
[Crossref]

Huang, C. C.

H. Y. Yang, P. Y. Chen, C. C. Huang, Y. Z. Zhuang, and Y. H. Shiau, “Low complexity underwater image enhancement based on dark channel prior,” in Proceedings of International Conference on Innovations in Bio-inspired Computing and Applications, (IEEE, 2011), pp. 17–20.

Ibrahim, A.

A. Ibrahim, S. Tominaga, and T. Horiuchi, “Invariant representation for spectral reflectance images and its application,” EURASIP J. Image Vide. 2011, 1–12 (2011).
[Crossref]

Ismail, R.

R. Ismail, O. Mutanga, and U. Bob, “Forest health and vitality: the detection and monitoring of pinus patula trees infected by sirex noctilio using digital multispectral imagery,” Southern Hemisphere Forestry Journal 69, 39–47 (2007).
[Crossref]

Jaubert, J.

P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, “The cover of living and dead corals from airborne remote sensing,” Coral Reefs 23, 171–183 (2004).
[Crossref]

Johnsen, G.

R. Pettersen, G. Johnsen, P. Bruheim, and T. Andreassen, “Development of hyperspectral imaging as a bio-optical taxonomic tool for pigmented marine organisms,” Organisms Diversity & Evolution 14, 237–246 (2014).
[Crossref]

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

Johnson-Roberson, M.

M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, “Colour-consistent structure-from-motion models using underwater imagery,” in Robotics: Science and Systems VIII, N. Roy, P. Newman, and S. Srinivasa, eds. (MIT, 2013).

Kaeli, J.

J. Kaeli, H. Singh, C. Murphy, and C. Kunz, “Improving color correction for underwater image surveys,” in Proceedings of IEEE/MTS OCEANS’11 (IEEE, 2011), pp. 805–810.

Kaneko, T.

A. Yamashita, M. Fujii, and T. Kaneko, “Color registration of underwater images for underwater sensing with consideration of light attenuation,” in Proceedings of 2007 IEEE International Conference on Robotics and Automation (IEEE, 2007), pp. 4570–4575.

Kara, M. A.

M. A. Kara, A. Mobammed, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett’s esophagus,” Gastrointest Endosc. 64, 176–185 (2006).
[Crossref] [PubMed]

M. A. Kara, M. Ennahachi, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Detection and classification of the mucosal and vascular patterns (mucosal morphology) in Barrett’s esophagus by using narrow band imaging,” Gastrointest Endosc. 64, 155–166 (2006).
[Crossref] [PubMed]

Keith, R.

S. W. Brown, G. P. Eppeldauer, P. George, and R. Keith, “NIST facility for spectral irradiance and radiance responsivity calibrations with uniform sources,” Metrologia 37, 579–583 (2000).
[Crossref]

Kohler, D. D. R.

Kunz, C.

J. Kaeli, H. Singh, C. Murphy, and C. Kunz, “Improving color correction for underwater image surveys,” in Proceedings of IEEE/MTS OCEANS’11 (IEEE, 2011), pp. 805–810.

Launeau, P.

P. Launeau, A. R. Cruden, and J. L. Bouchez, “Mineral recognition in digital images of rocks: a new approach using multichannel classification,” Can. Mineral. 32, 919 (1994).

Lawson, M.

D. R. Mishra, S. Narumalani, D. Rundquist, and M. Lawson, “Characterizing the vertical diffuse attenuation coefficient for downwelling irradiance in coastal waters: Implications for water penetration by high resolution satellite data,” ISPRS Journal of photogrammetry and remote sensing 60, 48–64 (2005).
[Crossref]

Leathers, R. A.

LeDrew, E.

H. Holden and E. LeDrew, “Effects of the water column on hyperspectral reflectance of submerged coral reef features,” Bulletin of Marine Science 69, 685–699 (2001).

Li, X.

X. Li and Z. Ceng, Geometrical Optics, Aberrations and Optical Design, 2nd ed. (Zhejiang University, 2007).

Lin, S.

S. Lin and L. Zhang, “Determining the radiometric response function from a single grayscale image,” in Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 2005), pp. 66–73.

Liu, C.

C. Liu, W. Liu W, X Lu, W Chen, and J Yang, “Potential of multispectral imaging for real-time determination of colour change and moisture distribution in carrot slices during hot air dehydration,” Food Chem. 195, 110–116 (2016).
[Crossref]

Liu W, W.

C. Liu, W. Liu W, X Lu, W Chen, and J Yang, “Potential of multispectral imaging for real-time determination of colour change and moisture distribution in carrot slices during hot air dehydration,” Food Chem. 195, 110–116 (2016).
[Crossref]

Louchard, E. M.

Lu, X

C. Liu, W. Liu W, X Lu, W Chen, and J Yang, “Potential of multispectral imaging for real-time determination of colour change and moisture distribution in carrot slices during hot air dehydration,” Food Chem. 195, 110–116 (2016).
[Crossref]

Ludvigsen, M.

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

McLauchlan, L.

M. Mehrubeoglu, D. K. Smith, S. W. Smith, K. B. Strychar, and L. McLauchlan, “Investigating coral hyperspectral properties across coral species and coral state using hyperspectral imaging,” Proc. SPIE 8870, 88700M (2013).
[Crossref]

Mehrubeoglu, M.

M. Mehrubeoglu, M. Y. Teng, and P. V. Zimba, “Resolving mixed algal species in hyperspectral images,” Sensors 14, 1–21 (2013).
[Crossref]

M. Mehrubeoglu, D. K. Smith, S. W. Smith, K. B. Strychar, and L. McLauchlan, “Investigating coral hyperspectral properties across coral species and coral state using hyperspectral imaging,” Proc. SPIE 8870, 88700M (2013).
[Crossref]

Mishra, D. R.

D. R. Mishra, S. Narumalani, D. Rundquist, and M. Lawson, “Characterizing the vertical diffuse attenuation coefficient for downwelling irradiance in coastal waters: Implications for water penetration by high resolution satellite data,” ISPRS Journal of photogrammetry and remote sensing 60, 48–64 (2005).
[Crossref]

Mobammed, A.

M. A. Kara, A. Mobammed, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett’s esophagus,” Gastrointest Endosc. 64, 176–185 (2006).
[Crossref] [PubMed]

Mobley, C. D.

Moline, M.

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

Montes, M. J.

Mumby, P. J.

P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, “The cover of living and dead corals from airborne remote sensing,” Coral Reefs 23, 171–183 (2004).
[Crossref]

P. J. Mumby, C. D. Clark, E. P. Green, and A. J. Edwards, “Benefits of water column correction and contextual editing for mapping coral reefs,” International Journal of Remote Sensing 19, 203–210 (1998).
[Crossref]

Murphy, C.

J. Kaeli, H. Singh, C. Murphy, and C. Kunz, “Improving color correction for underwater image surveys,” in Proceedings of IEEE/MTS OCEANS’11 (IEEE, 2011), pp. 805–810.

Mutanga, O.

R. Ismail, O. Mutanga, and U. Bob, “Forest health and vitality: the detection and monitoring of pinus patula trees infected by sirex noctilio using digital multispectral imagery,” Southern Hemisphere Forestry Journal 69, 39–47 (2007).
[Crossref]

Narumalani, S.

D. R. Mishra, S. Narumalani, D. Rundquist, and M. Lawson, “Characterizing the vertical diffuse attenuation coefficient for downwelling irradiance in coastal waters: Implications for water penetration by high resolution satellite data,” ISPRS Journal of photogrammetry and remote sensing 60, 48–64 (2005).
[Crossref]

Peters, F. P.

M. A. Kara, A. Mobammed, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett’s esophagus,” Gastrointest Endosc. 64, 176–185 (2006).
[Crossref] [PubMed]

Pettersen, R.

R. Pettersen, G. Johnsen, P. Bruheim, and T. Andreassen, “Development of hyperspectral imaging as a bio-optical taxonomic tool for pigmented marine organisms,” Organisms Diversity & Evolution 14, 237–246 (2014).
[Crossref]

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

Pizarro, O.

M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, “Colour-consistent structure-from-motion models using underwater imagery,” in Robotics: Science and Systems VIII, N. Roy, P. Newman, and S. Srinivasa, eds. (MIT, 2013).

Reid, R. P.

Ripley, H.

P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, “The cover of living and dead corals from airborne remote sensing,” Coral Reefs 23, 171–183 (2004).
[Crossref]

Rundquist, D.

D. R. Mishra, S. Narumalani, D. Rundquist, and M. Lawson, “Characterizing the vertical diffuse attenuation coefficient for downwelling irradiance in coastal waters: Implications for water penetration by high resolution satellite data,” ISPRS Journal of photogrammetry and remote sensing 60, 48–64 (2005).
[Crossref]

Shanmugam, P.

Y. H. Ahn and P. Shanmugam, “Detecting the red tide algal blooms from satellite ocean color observations in optically complex Northeast-Asia Coastal waters,” Remote Sens. Environ. 103, 419–437 (2006).
[Crossref]

Shiau, Y. H.

H. Y. Yang, P. Y. Chen, C. C. Huang, Y. Z. Zhuang, and Y. H. Shiau, “Low complexity underwater image enhancement based on dark channel prior,” in Proceedings of International Conference on Innovations in Bio-inspired Computing and Applications, (IEEE, 2011), pp. 17–20.

Singh, H.

J. Kaeli, H. Singh, C. Murphy, and C. Kunz, “Improving color correction for underwater image surveys,” in Proceedings of IEEE/MTS OCEANS’11 (IEEE, 2011), pp. 805–810.

Smith, D. K.

M. Mehrubeoglu, D. K. Smith, S. W. Smith, K. B. Strychar, and L. McLauchlan, “Investigating coral hyperspectral properties across coral species and coral state using hyperspectral imaging,” Proc. SPIE 8870, 88700M (2013).
[Crossref]

Smith, S. W.

M. Mehrubeoglu, D. K. Smith, S. W. Smith, K. B. Strychar, and L. McLauchlan, “Investigating coral hyperspectral properties across coral species and coral state using hyperspectral imaging,” Proc. SPIE 8870, 88700M (2013).
[Crossref]

Sreide, F.

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

Strychar, K. B.

M. Mehrubeoglu, D. K. Smith, S. W. Smith, K. B. Strychar, and L. McLauchlan, “Investigating coral hyperspectral properties across coral species and coral state using hyperspectral imaging,” Proc. SPIE 8870, 88700M (2013).
[Crossref]

Sundgren, D.

J. Åhlén, E. Bengtsson, and D. Sundgren, “Evaluation of underwater spectral data for colour correction applications,” in Proceedings of the 5th WSEAS International Conference on Circuits, Systems, Electronics, Control & Signal Processing (World Scientific and Engineering Academy and Society, 2006), pp. 321–326.

Sundman, L. K.

Swinehart, D. F.

D. F. Swinehart, “The beer-lambert law,” J. Chem. Educ. 39, 333 (1962).
[Crossref]

ten Kate, F. J. W.

M. A. Kara, M. Ennahachi, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Detection and classification of the mucosal and vascular patterns (mucosal morphology) in Barrett’s esophagus by using narrow band imaging,” Gastrointest Endosc. 64, 155–166 (2006).
[Crossref] [PubMed]

M. A. Kara, A. Mobammed, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett’s esophagus,” Gastrointest Endosc. 64, 176–185 (2006).
[Crossref] [PubMed]

Teng, M. Y.

M. Mehrubeoglu, M. Y. Teng, and P. V. Zimba, “Resolving mixed algal species in hyperspectral images,” Sensors 14, 1–21 (2013).
[Crossref]

Tominaga, S.

A. Ibrahim, S. Tominaga, and T. Horiuchi, “Invariant representation for spectral reflectance images and its application,” EURASIP J. Image Vide. 2011, 1–12 (2011).
[Crossref]

Volent, Z.

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

Voss, K. J.

A. C. R. Gleason, R. P. Reid, and K. J. Voss, “Automated classification of underwater multispectral imagery for coral reef monitoring,” in Proceedings of IEEE/MTS OCEANS’07 (IEEE, 2007), pp. 1–8.

Williams, S. B.

D. L. Bongiorno, M. Bryson, and S. B. Williams, “Dynamic spectral-based underwater colour correction,” in Proceedings of IEEE/MTS OCEANS’13 (IEEE2013), pp. 1–9.

M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, “Colour-consistent structure-from-motion models using underwater imagery,” in Robotics: Science and Systems VIII, N. Roy, P. Newman, and S. Srinivasa, eds. (MIT, 2013).

Yamashita, A.

A. Yamashita, M. Fujii, and T. Kaneko, “Color registration of underwater images for underwater sensing with consideration of light attenuation,” in Proceedings of 2007 IEEE International Conference on Robotics and Automation (IEEE, 2007), pp. 4570–4575.

Yang, H. Y.

H. Y. Yang, P. Y. Chen, C. C. Huang, Y. Z. Zhuang, and Y. H. Shiau, “Low complexity underwater image enhancement based on dark channel prior,” in Proceedings of International Conference on Innovations in Bio-inspired Computing and Applications, (IEEE, 2011), pp. 17–20.

Yang, J

C. Liu, W. Liu W, X Lu, W Chen, and J Yang, “Potential of multispectral imaging for real-time determination of colour change and moisture distribution in carrot slices during hot air dehydration,” Food Chem. 195, 110–116 (2016).
[Crossref]

Zawada, D.

D. Zawada, “Image processing of underwater multispectral imagery,” IEEE Journal of Oceanic Engineering 28, 583–594 (2003).
[Crossref]

Zhang, L.

S. Lin and L. Zhang, “Determining the radiometric response function from a single grayscale image,” in Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 2005), pp. 66–73.

Zhuang, Y. Z.

H. Y. Yang, P. Y. Chen, C. C. Huang, Y. Z. Zhuang, and Y. H. Shiau, “Low complexity underwater image enhancement based on dark channel prior,” in Proceedings of International Conference on Innovations in Bio-inspired Computing and Applications, (IEEE, 2011), pp. 17–20.

Zimba, P. V.

M. Mehrubeoglu, M. Y. Teng, and P. V. Zimba, “Resolving mixed algal species in hyperspectral images,” Sensors 14, 1–21 (2013).
[Crossref]

Appl. Opt. (2)

Bulletin of Marine Science (1)

H. Holden and E. LeDrew, “Effects of the water column on hyperspectral reflectance of submerged coral reef features,” Bulletin of Marine Science 69, 685–699 (2001).

Can. Mineral. (1)

P. Launeau, A. R. Cruden, and J. L. Bouchez, “Mineral recognition in digital images of rocks: a new approach using multichannel classification,” Can. Mineral. 32, 919 (1994).

Coral Reefs (1)

P. J. Mumby, J. D. Hedley, J. Chisholm, C. Clark, H. Ripley, and J. Jaubert, “The cover of living and dead corals from airborne remote sensing,” Coral Reefs 23, 171–183 (2004).
[Crossref]

EURASIP J. Image Vide. (1)

A. Ibrahim, S. Tominaga, and T. Horiuchi, “Invariant representation for spectral reflectance images and its application,” EURASIP J. Image Vide. 2011, 1–12 (2011).
[Crossref]

Food Chem. (1)

C. Liu, W. Liu W, X Lu, W Chen, and J Yang, “Potential of multispectral imaging for real-time determination of colour change and moisture distribution in carrot slices during hot air dehydration,” Food Chem. 195, 110–116 (2016).
[Crossref]

Gastrointest Endosc. (2)

M. A. Kara, M. Ennahachi, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Detection and classification of the mucosal and vascular patterns (mucosal morphology) in Barrett’s esophagus by using narrow band imaging,” Gastrointest Endosc. 64, 155–166 (2006).
[Crossref] [PubMed]

M. A. Kara, A. Mobammed, F. P. Peters, P. Fockens, F. J. W. ten Kate, and J. H. M. Bergman, “Endoscopic video-autofluorescence imaging followed by narrow band imaging for detecting early neoplasia in Barrett’s esophagus,” Gastrointest Endosc. 64, 176–185 (2006).
[Crossref] [PubMed]

IEEE Journal of Oceanic Engineering (1)

D. Zawada, “Image processing of underwater multispectral imagery,” IEEE Journal of Oceanic Engineering 28, 583–594 (2003).
[Crossref]

International Journal of Remote Sensing (1)

P. J. Mumby, C. D. Clark, E. P. Green, and A. J. Edwards, “Benefits of water column correction and contextual editing for mapping coral reefs,” International Journal of Remote Sensing 19, 203–210 (1998).
[Crossref]

ISPRS Journal of photogrammetry and remote sensing (1)

D. R. Mishra, S. Narumalani, D. Rundquist, and M. Lawson, “Characterizing the vertical diffuse attenuation coefficient for downwelling irradiance in coastal waters: Implications for water penetration by high resolution satellite data,” ISPRS Journal of photogrammetry and remote sensing 60, 48–64 (2005).
[Crossref]

J. Chem. Educ. (1)

D. F. Swinehart, “The beer-lambert law,” J. Chem. Educ. 39, 333 (1962).
[Crossref]

Metrologia (1)

S. W. Brown, G. P. Eppeldauer, P. George, and R. Keith, “NIST facility for spectral irradiance and radiance responsivity calibrations with uniform sources,” Metrologia 37, 579–583 (2000).
[Crossref]

Organisms Diversity & Evolution (1)

R. Pettersen, G. Johnsen, P. Bruheim, and T. Andreassen, “Development of hyperspectral imaging as a bio-optical taxonomic tool for pigmented marine organisms,” Organisms Diversity & Evolution 14, 237–246 (2014).
[Crossref]

Proc. SPIE (1)

M. Mehrubeoglu, D. K. Smith, S. W. Smith, K. B. Strychar, and L. McLauchlan, “Investigating coral hyperspectral properties across coral species and coral state using hyperspectral imaging,” Proc. SPIE 8870, 88700M (2013).
[Crossref]

Remote Sens. Environ. (1)

Y. H. Ahn and P. Shanmugam, “Detecting the red tide algal blooms from satellite ocean color observations in optically complex Northeast-Asia Coastal waters,” Remote Sens. Environ. 103, 419–437 (2006).
[Crossref]

Sensors (1)

M. Mehrubeoglu, M. Y. Teng, and P. V. Zimba, “Resolving mixed algal species in hyperspectral images,” Sensors 14, 1–21 (2013).
[Crossref]

Southern Hemisphere Forestry Journal (1)

R. Ismail, O. Mutanga, and U. Bob, “Forest health and vitality: the detection and monitoring of pinus patula trees infected by sirex noctilio using digital multispectral imagery,” Southern Hemisphere Forestry Journal 69, 39–47 (2007).
[Crossref]

Other (14)

A. C. R. Gleason, R. P. Reid, and K. J. Voss, “Automated classification of underwater multispectral imagery for coral reef monitoring,” in Proceedings of IEEE/MTS OCEANS’07 (IEEE, 2007), pp. 1–8.

S. Aarrestad, “Use of underwater hyperspectral imagery for geological characterization of the seabed,” Masters thesis (Norwegian University of Science and Technology, 2014).

H. Y. Yang, P. Y. Chen, C. C. Huang, Y. Z. Zhuang, and Y. H. Shiau, “Low complexity underwater image enhancement based on dark channel prior,” in Proceedings of International Conference on Innovations in Bio-inspired Computing and Applications, (IEEE, 2011), pp. 17–20.

J. Åhlén, E. Bengtsson, and D. Sundgren, “Evaluation of underwater spectral data for colour correction applications,” in Proceedings of the 5th WSEAS International Conference on Circuits, Systems, Electronics, Control & Signal Processing (World Scientific and Engineering Academy and Society, 2006), pp. 321–326.

J. Åhlén, “Colour correction of underwater images using spectral data,” Ph.D. thesis (Acta Universitatis Up-saliensis, 2005).

D. L. Bongiorno, M. Bryson, and S. B. Williams, “Dynamic spectral-based underwater colour correction,” in Proceedings of IEEE/MTS OCEANS’13 (IEEE2013), pp. 1–9.

J. Kaeli, H. Singh, C. Murphy, and C. Kunz, “Improving color correction for underwater image surveys,” in Proceedings of IEEE/MTS OCEANS’11 (IEEE, 2011), pp. 805–810.

M. Bryson, M. Johnson-Roberson, O. Pizarro, and S. B. Williams, “Colour-consistent structure-from-motion models using underwater imagery,” in Robotics: Science and Systems VIII, N. Roy, P. Newman, and S. Srinivasa, eds. (MIT, 2013).

A. Yamashita, M. Fujii, and T. Kaneko, “Color registration of underwater images for underwater sensing with consideration of light attenuation,” in Proceedings of 2007 IEEE International Conference on Robotics and Automation (IEEE, 2007), pp. 4570–4575.

G. Johnsen, Z. Volent, H. Dierssen, R. Pettersen, M. Ardelan, F. Sreide, P. Fearns, M. Ludvigsen, and M. Moline, “Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties,” in Subsea Optics and Imaging, J. Watson and O. Zielinski, eds. (Woodhead Publishing, 2013), pp. 508–535.
[Crossref]

X. Li and Z. Ceng, Geometrical Optics, Aberrations and Optical Design, 2nd ed. (Zhejiang University, 2007).

S. Lin and L. Zhang, “Determining the radiometric response function from a single grayscale image,” in Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (IEEE, 2005), pp. 66–73.

A. H. Munsell, “Munsell color system”, http://munsell.com/

Thorlabs, “Optical filters”, http://www.thorlabs.com/

Cited By

OSA participates in Crossref's Cited-By Linking service. Citing articles from OSA journals and other participating publishers are listed here.

Alert me when this article is cited.


Figures (12)

Fig. 1 Illustration of narrowband spectral imaging in air and underwater. At the same object distance, the image is enlarged due to refraction by the water, and the image brightness is reduced by water attenuation as well as by the enlargement of the image.

Fig. 2 Refraction at the interfaces between water, glass and air. Due to refraction by the water, the object appears closer than it actually is; the equivalent object distance z_w is therefore defined.

Fig. 3 The experimental setup consists of a mobile phone, a water tank and a narrowband spectral imaging system, as shown in (a) and (b). The mobile phone is placed in a waterproof box and acts as a luminous underwater object. The pattern displayed on the phone screen consists of 60 color pieces in 6 rows and 10 columns. The filters in the spectral imaging system are selected by rotating the filter wheel, scanning the wavelength from 420 nm to 700 nm at an interval of 20 nm. The typical transmission of a filter with a central wavelength of around 460 nm [33] is shown in (c).

Fig. 4 Response of the spectrometer to the color piece (3,7) on the phone screen. The intensity at wavelengths of 447 nm, 540 nm and 660 nm is recorded over 3500 scans (i.e., 175 s), showing a stable light output from the phone screen.

Fig. 5 Linear fitting between the brightness of underwater images and the brightness of images in air, for the estimation of k(λ,l) and b(λ,l), at wavelengths of 460 nm and 620 nm and underwater distances of 60 cm, 160 cm and 260 cm.

Fig. 6 Exponential fitting between k(λ,l) and the underwater distance l, for the estimation of the coefficients α(λ) and β(λ). The RSD of the calibration data set is 3.8% for 460 nm and 3.2% for 620 nm. For the test distances, the RSD is 6.4% for 460 nm and 3.6% for 620 nm.

Fig. 7 Exponential fitting between b(λ,l) and the underwater distance l, for the estimation of the coefficients ν(λ), κ(λ) and γ(λ). The RSD of the calibration data set is 6.3% for 460 nm and 2.0% for 620 nm. For the test distances, the RSD is 9.5% for 460 nm and 2.9% for 620 nm.

Fig. 8 Comparison between raw underwater images, restored underwater images and images in air. The brightness and size of the underwater images decrease with the underwater distance. Restoration improves the brightness of the underwater images significantly.

Fig. 9 Comparison among the raw underwater spectrum, the restored underwater spectrum and the spectrum in air, for the color pieces (1,5), (2,2), (3,9) and (4,7), at an underwater distance of 240 cm. The restored spectra almost overlap the spectra in air, indicating an accurate compensation of the spectral energy loss caused by the water.

Fig. 10 Color images constructed from (a) raw underwater spectral images captured at an underwater distance of 240 cm, (b) restored underwater spectral images with the image size unchanged, (c) restored underwater spectral images with the image size rescaled to that of the air image and the image brightness scaled accordingly, and (d) spectral images captured in air at an object distance of 340 cm. The improvement achieved by restoration is clearly visible in the color images.

Fig. 11 Dependence of the relative restoration error ε on the underwater distance (a) and on the wavelength (b). The restoration error is averaged over wavelength for each distance in (a) and over distance for each wavelength in (b). The spectrum of the mobile phone screen emitting white light, measured by the fiber spectrometer, is also depicted in (b), showing an abrupt change in intensity within the wavelength range of 420–480 nm.

Fig. 12 Sensitivity of the restoration error for images at 460 nm to variation in the optical properties of the water (a) and to error in the underwater distance measurement (b).

Equations (27)


$$x' = \frac{f}{z-f}\,x,\qquad y' = \frac{f}{z-f}\,y,\qquad z' = \frac{f}{z-f}\,z.$$

$$E(x',y',\lambda,z) = \frac{\pi}{4}\left(\frac{D}{f}\right)^{2} C\,\tau_l(\lambda)\,\tau_f(\lambda)\left(\frac{z-f}{z}\right)^{2} G(z)\,L(x,y,\lambda)\cos^{4}\theta,$$

$$\cos^{2}\theta = \frac{z^{2}}{x^{2}+y^{2}+z^{2}}.$$

$$E(x',y',\lambda,z) = C\,G(z)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x,y,\lambda).$$
$$I(i,j,\lambda_c,z) = a\,t\iint_{S_{i,j}}\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} q(\lambda)\,E(x',y',\lambda)\,d\lambda\,dx'dy' = a\,t\iint_{S_{i,j}}\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} q(\lambda)\,C\,G(z)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x,y,\lambda)\,d\lambda\,dx'dy',$$

$$I(i,j,\lambda_c,z) = \underbrace{a\,t\iint_{S_{i,j}}dx'dy'}_{A}\;C\,G(z)\,\underbrace{\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} q(\lambda)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x,y,\lambda)\,d\lambda}_{H(x,y,\lambda_c)} = A\,C\,G(z)\,H(x,y,\lambda_c).$$
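To make the band-integration step concrete, the short Python sketch below evaluates the bracketed quantity H numerically from sampled curves. All inputs (sensor quantum efficiency q, lens transmission τ_l, filter transmission τ_f and object radiance L) are hypothetical placeholders rather than values from this paper, and the geometry factors A, C and G(z) are set to unity.

```python
import numpy as np

# Hypothetical curves sampled at 1 nm over a 20 nm-wide passband around 460 nm.
wavelengths = np.arange(450.0, 471.0)                       # nm
dlam = 1.0                                                  # sampling step (nm)
q     = np.full_like(wavelengths, 0.55)                     # sensor quantum efficiency
tau_l = np.full_like(wavelengths, 0.92)                     # lens transmission
tau_f = np.exp(-0.5 * ((wavelengths - 460.0) / 5.0) ** 2)   # filter transmission
L     = np.full_like(wavelengths, 1.0)                      # object radiance (a.u.)

# H(x, y, lambda_c): band integral of q * tau_l * tau_f * L (Riemann sum).
H = np.sum(q * tau_l * tau_f * L) * dlam

# Pixel intensity I = A * C * G(z) * H, with geometry factors set to 1 here.
A, C, G_z = 1.0, 1.0, 1.0
I = A * C * G_z * H
```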
$$x_w' = \frac{f}{z_w-f}\,x,\qquad y_w' = \frac{f}{z_w-f}\,y,\qquad z_w' = \frac{f}{z_w-f}\,z_w,$$

$$z_w = l' + \frac{l}{n_w} + \frac{d}{n_g},$$
$$E_w(x_w',y_w',\lambda,z_w) = \beta(\lambda)\,e^{-\alpha(\lambda)l}\,C\,G(z_w)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x,y,\lambda) + E_b(\lambda)\,e^{-\nu(\lambda)l} + E_s(\lambda),$$

$$I_w(i_w,j_w,\lambda_c,z_w) = a\,t\iint_{S_{i,j}}\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} q(\lambda)\,E_w(x_w',y_w',\lambda,z_w)\,d\lambda\,dx'dy' = A\,C\,G(z_w)\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} q(\lambda)\,\beta(\lambda)\,e^{-\alpha(\lambda)l}\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x,y,\lambda)\,d\lambda + A\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} E_b(\lambda)\,e^{-\nu(\lambda)l}\,d\lambda + A\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} E_s(\lambda)\,d\lambda,$$

$$\alpha(\lambda)\approx\alpha(\lambda_c),\quad \beta(\lambda)\approx\beta(\lambda_c),\quad E_b(\lambda)\approx E_b(\lambda_c),\quad \nu(\lambda)\approx\nu(\lambda_c),\quad E_s(\lambda)\approx E_s(\lambda_c),$$

$$I_w(i_w,j_w,\lambda_c,z_w) \approx \beta(\lambda_c)\,e^{-\alpha(\lambda_c)l}\,A\,C\,G(z_w)\int_{\lambda_c-\Delta\lambda_1}^{\lambda_c+\Delta\lambda_2} q(\lambda)\,\tau_l(\lambda)\,\tau_f(\lambda)\,L(x,y,\lambda)\,d\lambda + \underbrace{A\,E_b(\lambda_c)\,(\Delta\lambda_1+\Delta\lambda_2)}_{\kappa(\lambda_c)}\,e^{-\nu(\lambda_c)l} + \underbrace{A\,E_s(\lambda_c)\,(\Delta\lambda_1+\Delta\lambda_2)}_{\gamma(\lambda_c)} = \beta(\lambda_c)\,e^{-\alpha(\lambda_c)l}\,A\,C\,G(z_w)\,H(x,y,\lambda_c) + \kappa(\lambda_c)\,e^{-\nu(\lambda_c)l} + \gamma(\lambda_c),$$

$$I_w(i_w,j_w,\lambda_c,z_w) = \frac{G(z_w)}{G(z)}\,\underbrace{\beta(\lambda_c)\,e^{-\alpha(\lambda_c)l}}_{k(\lambda_c,l)}\,I(i,j,\lambda_c,z) + \underbrace{\kappa(\lambda_c)\,e^{-\nu(\lambda_c)l} + \gamma(\lambda_c)}_{b(\lambda_c,l)}.$$
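At a fixed wavelength and underwater distance, the relation above says the underwater brightness is an affine function of the in-air brightness. A minimal sketch of this forward model, with placeholder coefficients rather than calibrated values from the paper:

```python
def underwater_intensity(I_air, k, b, G_ratio):
    """Forward model: I_w = (G(z_w)/G(z)) * k(lambda_c, l) * I_air + b(lambda_c, l)."""
    return G_ratio * k * I_air + b

# Hypothetical coefficients for illustration only.
I_w = underwater_intensity(I_air=120.0, k=0.4, b=2.5, G_ratio=0.9)
```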
$$\left\{\begin{aligned} I_w(i_{w,1},j_{w,1},\lambda_1,z_{w,1}) &= \frac{G(z_{w,1})}{G(z_1)}\,k(\lambda_1,l_1)\,I(i_1,j_1,\lambda_1,z_1) + b(\lambda_1,l_1),\\ I_w(i_{w,2},j_{w,2},\lambda_1,z_{w,1}) &= \frac{G(z_{w,1})}{G(z_1)}\,k(\lambda_1,l_1)\,I(i_2,j_2,\lambda_1,z_1) + b(\lambda_1,l_1),\\ &\;\;\vdots\\ I_w(i_{w,N},j_{w,N},\lambda_1,z_{w,1}) &= \frac{G(z_{w,1})}{G(z_1)}\,k(\lambda_1,l_1)\,I(i_N,j_N,\lambda_1,z_1) + b(\lambda_1,l_1). \end{aligned}\right.$$

$$\underbrace{\begin{bmatrix} \frac{G(z_{w,1})}{G(z_1)}\,I(i_1,j_1,\lambda_1,z_1) & 1\\ \frac{G(z_{w,1})}{G(z_1)}\,I(i_2,j_2,\lambda_1,z_1) & 1\\ \vdots & \vdots\\ \frac{G(z_{w,1})}{G(z_1)}\,I(i_N,j_N,\lambda_1,z_1) & 1 \end{bmatrix}}_{D} \underbrace{\begin{bmatrix} k(\lambda_1,l_1)\\ b(\lambda_1,l_1) \end{bmatrix}}_{X} = \underbrace{\begin{bmatrix} I_w(i_{w,1},j_{w,1},\lambda_1,z_{w,1})\\ I_w(i_{w,2},j_{w,2},\lambda_1,z_{w,1})\\ \vdots\\ I_w(i_{w,N},j_{w,N},\lambda_1,z_{w,1}) \end{bmatrix}}_{Y},$$

$$\hat{X} = (D^{T}D)^{-1}D^{T}Y,$$
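The sketch below performs this least-squares estimation for one wavelength and one distance, assuming N paired brightness samples of the same color pieces in air and underwater and a known ratio G(z_{w,1})/G(z_1). It calls numpy.linalg.lstsq instead of forming (D^T D)^{-1} D^T explicitly, which is numerically equivalent for a well-conditioned D; the sample values are hypothetical.

```python
import numpy as np

def fit_k_b(I_air, I_w, G_ratio):
    """Estimate k(lambda, l) and b(lambda, l) from paired samples via
    the linear model I_w = G_ratio * k * I_air + b."""
    I_air = np.asarray(I_air, dtype=float)
    I_w = np.asarray(I_w, dtype=float)
    D = np.column_stack([G_ratio * I_air, np.ones_like(I_air)])  # design matrix D
    X, *_ = np.linalg.lstsq(D, I_w, rcond=None)                  # X = [k, b]
    return X[0], X[1]

# Hypothetical brightness values for four color pieces (not measured data).
k, b = fit_k_b(I_air=[30.0, 80.0, 150.0, 210.0],
               I_w=[12.0, 29.0, 52.0, 73.0],
               G_ratio=0.9)
```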
$$\beta(\lambda_1)\,\underbrace{\begin{bmatrix} e^{-\alpha(\lambda_1)l_1}\\ e^{-\alpha(\lambda_1)l_2}\\ \vdots\\ e^{-\alpha(\lambda_1)l_M} \end{bmatrix}}_{P(\alpha)} = \underbrace{\begin{bmatrix} k(\lambda_1,l_1)\\ k(\lambda_1,l_2)\\ \vdots\\ k(\lambda_1,l_M) \end{bmatrix}}_{K}.$$

$$\hat{\alpha}(\lambda_1),\,\hat{\beta}(\lambda_1) = \arg\min_{\alpha^*,\beta^*}\;\underbrace{\bigl\| K - \beta^* P(\alpha^*)\bigr\|_2^2}_{J},$$

$$\kappa(\lambda_1)\,\underbrace{\begin{bmatrix} e^{-\nu(\lambda_1)l_1}\\ e^{-\nu(\lambda_1)l_2}\\ \vdots\\ e^{-\nu(\lambda_1)l_M} \end{bmatrix}}_{Q(\nu)} + \gamma(\lambda_1) = \underbrace{\begin{bmatrix} b(\lambda_1,l_1)\\ b(\lambda_1,l_2)\\ \vdots\\ b(\lambda_1,l_M) \end{bmatrix}}_{B}.$$

$$\hat{\nu}(\lambda_1),\,\hat{\kappa}(\lambda_1),\,\hat{\gamma}(\lambda_1) = \arg\min_{\nu^*,\kappa^*,\gamma^*}\;\underbrace{\bigl\| B - \gamma^* - \kappa^* Q(\nu^*)\bigr\|_2^2}_{J},$$
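Both exponential fits can be carried out with any nonlinear least-squares routine; the sketch below uses scipy.optimize.curve_fit. The distances and the k and b series are synthetic stand-ins for calibration output, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical calibration results at M underwater distances (cm).
l = np.array([20.0, 60.0, 100.0, 140.0, 180.0, 220.0, 260.0])
k = 0.95 * np.exp(-0.004 * l)           # stand-in for estimated k(lambda, l)
b = 5.0 * np.exp(-0.006 * l) + 1.2      # stand-in for estimated b(lambda, l)

# k(lambda, l) = beta * exp(-alpha * l): estimate alpha and beta.
(alpha_hat, beta_hat), _ = curve_fit(
    lambda l, a, B: B * np.exp(-a * l), l, k, p0=(0.001, 1.0))

# b(lambda, l) = kappa * exp(-nu * l) + gamma: estimate nu, kappa and gamma.
(nu_hat, kappa_hat, gamma_hat), _ = curve_fit(
    lambda l, n, K, g: K * np.exp(-n * l) + g, l, b, p0=(0.001, 1.0, 0.0))
```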
$$\tilde{I}_w(i_w,j_w,\lambda_c,z_w) = \hat{\beta}^{-1}(\lambda_c)\,e^{\hat{\alpha}(\lambda_c)l}\left( I_w(i_w,j_w,\lambda_c,z_w) - \hat{\kappa}(\lambda_c)\,e^{-\hat{\nu}(\lambda_c)l} - \hat{\gamma}(\lambda_c)\right),$$

$$I_e(i,j,\lambda_c,z) = \Bigl| I(i,j,\lambda_c,z) - \underbrace{\frac{G(z)}{G(z_w)}\,\tilde{I}_w(i_w,j_w,\lambda_c,z_w)}_{\hat{I}(i,j,\lambda_c,z)} \Bigr|,$$

$$\varepsilon(i,j,\lambda_c,z) = \frac{I_e(i,j,\lambda_c,z)}{I(i,j,\lambda_c,z)} \times 100\%.$$
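With the five calibrated coefficients in hand, restoration and error evaluation reduce to a few array operations, as in the sketch below; the inputs are placeholders under the same assumptions as above.

```python
import numpy as np

def restore(I_w, l, alpha, beta, nu, kappa, gamma):
    """Restore underwater brightness: remove the background and stray-light
    terms, then rescale by exp(alpha * l) / beta."""
    return (I_w - kappa * np.exp(-nu * l) - gamma) * np.exp(alpha * l) / beta

def relative_error(I_air, I_restored, G_ratio):
    """Relative restoration error in percent. G_ratio = G(z)/G(z_w) maps the
    restored underwater image back to the in-air image geometry."""
    I_hat = G_ratio * I_restored
    return np.abs(I_air - I_hat) / I_air * 100.0
```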
$$\text{Intensity variation} = \frac{I_{\max}(\lambda) - I_{\min}(\lambda)}{I_{\mathrm{mean}}(\lambda)},$$

$$I_n = \frac{50}{t}\,\frac{I_0}{255},$$

$$\mathrm{RSD}(\hat{y},y) = \frac{\mathrm{std}(\hat{y}-y)}{\mathrm{std}(y)} \times 100\%.$$
$$I_c(x,y) = \frac{\displaystyle\sum_{\lambda=420}^{700} I(x,y,\lambda)\,S_c(\lambda)\,T(\lambda)\,S_m(\lambda)}{\displaystyle\max\Bigl\{\sum_{\lambda=420}^{700} I(x,y,\lambda)\,S_c(\lambda)\,T(\lambda)\,S_m(\lambda)\Bigr\}},\qquad c\in\{r,g,b\},$$
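A sketch of this color reconstruction, assuming a stack of 15 band images (420 nm to 700 nm at 20 nm steps) and hypothetical weighting curves S_c, T and S_m:

```python
import numpy as np

def reconstruct_color(I, S_c, T, S_m):
    """I: (H, W, N) stack of band images at N wavelengths.
    S_c: (3, N) weights for the r, g, b channels; T, S_m: (N,) spectral weights.
    Returns an (H, W, 3) color image normalized per channel, as in the
    equation above."""
    w = S_c * T[None, :] * S_m[None, :]          # combined (3, N) weights
    rgb = np.tensordot(I, w, axes=([2], [1]))    # weighted sum over wavelength
    return rgb / rgb.reshape(-1, 3).max(axis=0)  # normalize each channel by its max
```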
