1. Introduction
Recently, unmanned aerial vehicles (UAVs) have been introduced into agricultural research to monitor crops [1]. In contrast to satellite imagery and aircraft-based remote sensing, UAVs can be used frequently during the entire growth period. The main benefits are simple mission planning, instantaneous operation with low manpower and imaging below cloud cover [2]. By carrying low-cost commercial camera systems, UAVs provide ultra-high-resolution images of the crop canopy due to the low flight altitude. Current developments in photogrammetric algorithms are specifically adapted to the needs of UAV imagery. Thousands of images can be processed nearly automatically via ready-to-use software to produce orthoimages or surface models. Because wheat is the crop grown with the highest acreage [3], there is a strong interest in obtaining spatial and temporal information about the wheat canopy in high resolution, e.g., to adapt nitrogen and pesticide application site-specifically to improve production efficiency [4,5,6].
The periodic monitoring of canopy biophysical parameters, such as biomass, leaf area index (LAI) and plant height (PHT), is essential to understanding crop development, variations in canopy reflectance and net ecosystem exchange, or nitrogen and pesticide demand during the growing season [7,8,9]. These parameters and their derivatives are important for precision agriculture, remote sensing, crop modeling, ecosystem modeling and climate modeling. Crop growth models use a wide range of biophysical parameters as input and for validation to estimate future yield [10,11]. Biophysical parameters may further deliver vital information about the specific infection situation with fungal diseases to make field-specific decisions on plant protection [12]. The availability of this information at the field scale would enable a new generation of decision support systems that optimize fungicide application in winter wheat; e.g., the prototype “proPlant expert” recommends maximum application rates for up to three management zones within a field according to the yield expectation [7].
The monitoring of nitrogen (N) is an essential tool for investigating many metabolic and structural processes in maturing wheat plants, such as yield formation and health status. Because N is not immobilized in soils and no abundant reservoir of plant-available N is present, optimal crop production requires supplying N by fertilizer application throughout plant growth. However, excessive use of fertilizer eventually leads to unwanted N-leaching into groundwater or N-contamination of surface run-off, contributing to environmental pollution [13]. The N status cannot be estimated from the leaf nitrogen content alone; biophysical parameters of the wheat canopy should also be taken into account. For example, the nitrogen nutrition index (NNI), i.e., the ratio between the actual N concentration (Nt) observed in the plants and the critical N concentration related to dry biomass (Nct), has been used to reliably characterize the N status of wheat crops during the vegetative period [14]. NNI values >1 indicate excessive N supply (over-fertilization), whereas suboptimal N supply is indicated by values <1. Especially during flowering and the grain-filling growth stage, NNI is a good indicator of N deficiency affecting grain yield and grain protein [14,15].
Due to the complex interrelationship between many environmental factors, such as soil heterogeneity, cultivation and land surface, the parameters described above show high spatial and temporal variability, so that a high measurement density would be needed to reflect their spatial patterns within the field [16]. It has been shown that site-specific management strategies in the context of precision farming increase management efficiency [4]. However, accurate measurements of these parameters are time-consuming or destructive. For example, the calculation of the NNI involves the cutting of fresh biomass, subsequent drying and the determination of Nt using the Dumas or Kjeldahl method in the laboratory [17].
To become more efficient, indirect methods using sensors for estimating these parameters have been proposed and implemented. At the plot scale, sensor principles are available that enable on-spot measurements without destroying the canopy, mostly with direct contact to the wheat plants. For example, leaf chlorophyll meters relate the light transmittance or fluorescence of leaf parts to chlorophyll content and, by using certain indices, to leaf Nt [18]. The LAI can be modeled by devices measuring sunlight interception in the wheat canopy using radiative transfer models [19]. Most of these principles involve time-demanding measurements or sophisticated measurement protocols that can only be performed manually in a stop-and-go mode. Their use for online or high-throughput measurements has been discussed only very recently, e.g., mobile LAI determination by sunlight interception [20].
For the real-time determination of biophysical parameters and N status, many active sensing approaches have been studied, such as LiDAR, optical sensors and ultrasound sensors [21,22,23,24], in addition to more refined solutions, e.g., the pendulum-operated measurement principle Crop-Meter [25,26]. These sensors are mainly installed within or up to two meters above the canopy and are bound to a ground-driven vehicle such as a tractor. They have the advantage that they deliver information about the wheat canopy immediately and in high resolution. Optical sensors mostly use specific spectral vegetation indices (SVI) in the visual and infrared part of the spectrum. The GreenSeeker® (Trimble, Sunnyvale, CA, USA) detects the reflection between 656 nm (VIS) and 774 nm (NIR), whereas the CropCircle® (Holland Scientific Inc., Lincoln, NE, USA) allows for more freedom in the choice of wavelengths by using different filters [23]. To map larger areas, SVIs from satellite-based and aircraft-based remote sensing imagery have been related to these parameters [27,28]. In contrast to proximal sensing, these data have lower spatial and temporal resolution. UAV imagery might be able to close the gap between the plot scale, covered by manual and proximal sensors, and the regional scale, covered by traditional remote sensing, because of its high spatial resolution and almost instantaneous availability for practitioners and experts in agriculture. Most studies have investigated the relationship between biophysical parameters and UAV imagery, e.g., biomass [29,30,31,32,33], LAI [34,35,36,37,38], plant height [30] and grain yield [39], whereas fewer studies have shown the relationship between nitrogen and UAV imagery [31,35,40,41] in wheat crops so far. We summarized existing research studies relating UAV imagery with agronomic parameters of wheat crops in Table S1. Only Pölönen et al. [31] investigated dry biomass and total nitrogen content in combination with UAV hyperspectral imagery, which allowed them to calculate the under- or over-nourishment of the wheat plants during crop growth using the NNI. To the best of our knowledge, low-cost UAV imagery has not been tested for estimating NNI in cereal crops. Moreover, many studies examined only fields or parts of fields smaller than 3 ha.
The objective of our work was therefore to study the prospects of low-cost UAV imagery for monitoring biophysical plant parameters, N content and NNI of wheat crops from a practical viewpoint closer to agricultural routines. Three UAV missions were carried out between booting and maturity over an 11 ha field under soil-induced water deficit conditions, and we acquired a large set of aerial images with a consumer-level camera. Based on these, ultra-high-resolution orthoimages and surface models were computed through photogrammetric processing. We demonstrated the applicability of UAV imagery for recognizing the spatio-temporal patterns of wheat canopy development as observed in this field. Furthermore, we investigated the relationship of biophysical parameters, i.e., plant height, LAI and biomass, as well as nitrogen status, with image variables derived from the UAV imagery at specific dates. Regarding this relationship, we determined how well linear regression can model the spatial variability of these parameters in order to fulfil precision agricultural needs. This is important for establishing a monitoring system for crop variability that allows the fast production of maps for end users such as farmers or agricultural services adopting precision agriculture. This may improve typical applications of precision agriculture such as variable-rate fertilization or precision plant protection.
2. Materials and Methods
2.1. Test Site
The study was conducted within a field in Eastern Germany during the spring season in 2015 (51°49′N, 12°42′E). The soil development in the field was influenced by recent flood plain deposits of the Elbe River. The main soil type is a fluvic cambisol and the soil texture varies between sand, loamy sand and sandy loam. The crop grown was winter wheat (Triticum aestivum L. var. ‘Linus’). The seed row distance was 0.12 m, and the average crop density was between 440 and 480 ears·m−2.
2.2. UAV System and Flight Missions
We used a hexacopter (P-Y6, Hexapilots, Dresden, Germany) to carry a commercial camera system to acquire the aerial images for this study (Figure 1). The hexacopter consisted of six propellers aligned on a three-arm mounting with two propellers attached to each arm using the push/pull principle for failure safety. The navigation control system was a DJI Wookong M (DJI Innovations, Shenzhen, China) with an integrated GNSS receiver, which enabled user-defined waypoint tracking. Power was supplied by lithium polymer batteries (10,000 mAh/5S). With this setup, including the camera, approximately 15 min of flight duration was achieved.
For image acquisition, we used a Sony NEX-7 point-and-shoot camera with the following specifications: 24 megapixels, 23.5 × 15.4 mm sensor and E 16 mm F2.8 fixed lens (Sony Corporation, Tokyo, Japan). The camera was mounted on a gimbal underneath the copter. The DJI software triggered image capture, and the position of the copter was recorded by a GPS device.
The flight missions were planned to cover approximately 13 ha of ground area. The orthoimages and surface models were later clipped to 11 ha to correspond to the study area. Parallel tracks were flown with a between-track distance of 11 m at an altitude of 50 m, resulting in a side-lap of 60%. The frequency of image capture was set to correspond to a forward overlap of 60%. All images were taken from a near-nadir position with a ground resolution of approximately 0.012 m·px−1.
2.3. Ground Truthing
Three flight missions (M1–M3) were conducted over the wheat canopy during crop growth from booting until mature growth stages. An additional flight mission (M4) was performed after tillage to yield a UAV image of the bare ground surface. Mission dates, weather conditions and objectives of the flight missions are listed in Table 1.
For each mission, 20 locations were chosen with respect to the wheat variability observed at that date. These locations were selected along the tractor lanes by eye. Locations were at least 2 m away from the tractor lane so as not to be influenced by the clearance. At each plot, a ground area of 1 × 1 m was taken as the sample size. For georeferencing the UAV images, 20 panels with good visibility from above were laid out before the flight missions along a regular grid over the field within the tractor lanes. Panels and plots were located using the differential GPS HiPer Pro system (Topcon Positioning Systems, Inc., Livermore, CA, USA) with relative horizontal and vertical accuracies of 3 mm and 5 mm, respectively.
At each plot, the plant height (PHT), LAI, fresh biomass (FBM), dry biomass (DBM) and Nt were determined as agronomic reference measurements. The descriptive statistics and the abbreviations used in this paper are summarized in Table 2. Wheat plants were classified according to the BBCH growth stage code of Lancashire et al. [42]. PHT was determined by averaging the heights of wheat plants measured with a folding yardstick at 10 random locations within the plot. LAI was measured using the SunScan Canopy Analysis System type SS1 (Delta-T Devices, Cambridge, UK). The LAI probe was positioned at 45° to the direction of the seed rows under the canopy, and the reference sensor was in immediate proximity above the canopy without disturbing the incoming sunlight. The LAI measurement was taken as the average of 10 individual measurements with the probe repositioned each time within the plot. All wheat plants within the plot area were then cut directly above the ground using a short reaping hook. FBM was determined immediately after cutting. DBM was measured by drying biomass samples at 60 °C for 48 h in a compartment drier. The dry plant material was milled and analyzed for Nt following the dry combustion method of Dumas [43] using an elemental analyzer (Vario MAX CN Elemental Analyser, Elementar, Hanau, Germany).
2.4. Image Pre-Processing
All aerial images were recorded in RAW format and losslessly converted to the tagged image file format (TIFF) in the standard RGB color space. The full camera calibration matrix, including non-linear distortion coefficients, was estimated for the camera sensor using the Agisoft Lens software (v. 0.4.2, Agisoft LLC, St. Petersburg, Russia) by repeatedly photographing a checkerboard pattern. Agisoft Lens uses a pinhole camera model for lens calibration, and the distortion correction is modeled by Brown’s distortion model. We applied several radiometric pre-processing steps, described below, to improve the final UAV image mosaicking and the surface model generation. These algorithms were programmed in Matlab 2015b. To reduce colorimetric alterations, the corrections were conducted only on the value channel of the HSV-transformed aerial images. The corrected HSV images were then transformed back to RGB space.
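The value-channel workflow can be sketched as follows; a minimal per-pixel illustration (not the authors' Matlab code), assuming float RGB images in [0, 1]:

```python
import colorsys
import numpy as np

def correct_on_value_channel(rgb, correction):
    """Apply a radiometric correction only to the value (V) channel of an
    RGB image (floats in [0, 1]); hue and saturation stay untouched.
    Per-pixel for clarity; a vectorized HSV transform would be used in
    practice. `correction` is any callable mapping V to corrected V."""
    def fix_pixel(p):
        h, s, v = colorsys.rgb_to_hsv(*p)
        v = min(max(correction(v), 0.0), 1.0)   # keep V in valid range
        return colorsys.hsv_to_rgb(h, s, v)
    return np.apply_along_axis(fix_pixel, -1, rgb)
```

Keeping hue and saturation fixed is what limits colorimetric alteration: only the brightness of each pixel is modified by the subsequent corrections.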
To diminish the effect of brightness fall-off from the image center to the borders due to the sensor optics, all images were empirically corrected using the vignette correction algorithm proposed by Zheng et al. [44]. This processing involved calculating a mean image from a selection of homogeneous aerial images (n = 80). In our case, images were selected that depicted only the wheat canopy without prominent features such as trees. All further pre-processing steps were performed on the vignette-corrected images.
For better photogrammetric processing, contrast-enhanced images were produced from each aerial image. This was achieved by contrast limited adaptive histogram equalization (CLAHE). Each image was divided into eight tiles. Within each tile, the contrast was enhanced by histogram equalization following the Gaussian distribution. To avoid artificial boundaries, the neighboring tiles were then joined using bilinear interpolation. The images corrected by CLAHE were used during the mosaicking process and surface model generation.
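The per-tile step can be sketched as follows (numpy only, hypothetical helper `clahe_tile`); unlike the full CLAHE described above, this sketch equalizes toward a uniform target rather than a Gaussian one and omits the bilinear joining of neighboring tiles:

```python
import numpy as np

def clahe_tile(tile, clip=0.01, nbins=256):
    """Clip-limited histogram equalization of one tile (floats in [0, 1]):
    histogram counts above the clip limit are redistributed uniformly
    before building the cumulative distribution used as the mapping."""
    hist, edges = np.histogram(tile, bins=nbins, range=(0.0, 1.0))
    limit = max(1, int(clip * tile.size))
    excess = np.clip(hist - limit, 0, None).sum()
    hist = np.minimum(hist, limit) + excess // nbins   # redistribute excess
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    # monotone mapping of pixel values through the clipped CDF
    return np.interp(tile.ravel(), edges[:-1], cdf).reshape(tile.shape)
```

The clip limit is what distinguishes CLAHE from plain adaptive equalization: it caps how strongly near-uniform regions (and their noise) can be amplified.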
Different incidence and viewing angles may result in unwanted radiometric variations in the aerial images related to the bidirectional reflectance distribution function (BRDF). This effect was reduced following the empirical BRDF correction algorithm suggested by Lelong et al. [35]. In short, each image was block-wise averaged to a smaller representation by bilinear interpolation, smoothed by Gaussian filtering and finally over-sampled onto the original image with bicubic interpolation. Then, the inverted, smoothed values were subtracted from the original values. Specifically, all images were subdivided into 400 × 400 pixel blocks, and the Gaussian filter used a 3 × 3 pixel window. The BRDF-corrected set of images was used for the final texturing of the orthoimage.
2.5. Photogrammetric Processing
The vignette-corrected and CLAHE pre-processed images were used in the semi-automated processing flow of the photogrammetry software Agisoft PhotoScan (v. 1.2.4, Agisoft LLC, St. Petersburg, Russia) to generate orthoimages and surface models for M1–M4. The software implements Structure from Motion to estimate a 3D point cloud from the overlapping aerial images [45]. Using feature detection and description algorithms, key points that bear geometrical similarities in their immediate surroundings were matched between overlapping images. In the first step, a sparse 3D point cloud was generated to align the images and estimate the exact camera positions; here we used the options ‘high’ and ‘generic’. In the second step, a dense multi-view stereo reconstruction operating at the pixel level was applied to the aligned image set, generating a dense 3D point cloud; here we used the option ‘medium’ and ‘mild’ depth filtering. We chose this setting because higher quality takes exponentially more time and demands more computational resources; prior tests with the software showed ‘medium’ to be a good compromise with adequate processing time (4.6 cm/px resolution error). The mesh was generated using the option ‘height field’ to produce the orthoimages and the surface models. For texturing the orthoimages, the CLAHE pre-processed images were exchanged with the BRDF-corrected images. The texturing was performed with the software-implemented color correction. The ground resolution of the orthoimages was 0.025 m·px−1.
2.6. Extraction of Image Variables
We calculated a set of image variables from the orthoimages and the surface models at each measurement plot to relate them to the agronomic reference measurements. The image variables were computed by averaging the pixels within the plot areas. Furthermore, crop pixels were identified to calculate crop coverage (CVR). This was achieved by first converting the respective orthoimages into the LAB color space, because we found the best discrimination between soil and vegetation in the a-channel, corresponding to the red-green variation of the images. Secondly, within the whole orthoimage, we randomly selected regions of interest (ROI) containing only soil-related pixels. These ROIs were combined into a single vector, and the value at the 90th percentile was used as a threshold to differentiate between soil and vegetation pixels in the measurement plots. CVR was computed as the percentage of vegetation pixels identified within a plot. All image variables are described in Table 3.
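A sketch of the thresholding logic, assuming a red-green discriminant in which vegetation takes higher values than soil (e.g. the negated a* channel); the comparison direction and the helper names are illustrative assumptions, not the original implementation:

```python
import numpy as np

def crop_coverage(greenness, soil_rois, plot_mask, pct=90):
    """Crop coverage (CVR, %) by thresholding a red-green discriminant
    channel. Pixel values from soil-only ROIs define the threshold at
    their 90th percentile; plot pixels above it count as vegetation."""
    soil_values = np.concatenate([r.ravel() for r in soil_rois])
    threshold = np.percentile(soil_values, pct)
    vegetation = greenness[plot_mask] > threshold   # greener than soil
    return 100.0 * vegetation.mean()
```

Taking the 90th percentile of the soil sample, rather than its maximum, makes the threshold robust to a few outlier soil pixels while keeping soil false positives at or below about 10%.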
2.7. Statistical Analysis
The relationships between the image variables and reference measurements were investigated and tested for significant correlation using a Pearson correlation matrix for M1–M3. Principal component analysis (PCA) was conducted to explore the structural variation and dimensionality within all image variables for M1–M3. PCA was calculated on the correlation matrix to eliminate the influence of different standard deviations among the image variables. Loading plots were used to summarize the interrelationships between the image variables. Linear regression analysis was conducted using the agronomic reference measurements as dependent variables. As independent variables, the scores of the principal components (PC) that contributed sufficiently to the overall variance of the image variables were used to prevent multi-collinearity in the regression models. The best models were chosen by backward selection; variables with p > 0.1 were deleted from the set of independent variables [46]. For model comparison, we report the adjusted R2 values according to Wherry [47].
The NNI was calculated using the approach of Lemaire and Salette [48]:

NNI = Nt / Nct, with Nct = a · DBM^(−b)

where Nct is the critical N concentration related to a specific dry biomass (DBM). Justes et al. [15] specified the dilution curve for Nct statistically on wheat crops by proposing the values a = 5.35 and b = 0.442, which were used for the NNI determination in this study.
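With the coefficients above, the NNI computation is a one-liner; a sketch assuming Nt in % and DBM in the units for which the dilution curve was fitted (typically t·ha−1):

```python
import numpy as np

def nni(n_t, dbm, a=5.35, b=0.442):
    """Nitrogen nutrition index after Lemaire and Salette: the ratio of
    the actual N concentration Nt to the critical concentration
    Nct = a * DBM**(-b), with the wheat coefficients of Justes et al."""
    n_ct = a * np.asarray(dbm, float) ** (-b)
    return np.asarray(n_t, float) / n_ct
```

Values above 1 indicate over-fertilization, values below 1 suboptimal N supply, matching the interpretation given in the Introduction.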
All models were validated using independent model and validation data sets obtained by splitting the field into two equal halves orthogonal to the tractor lane direction. As validation criteria, the R2 of validation (R2val), expressed as the squared Pearson correlation coefficient, the root mean squared error (RMSE) and the mean error (ME) were calculated:

R2val = cov(x, y)^2 / (var(x) · var(y))
RMSE = sqrt( (1/n) · Σ (y_i − x_i)^2 )
ME = (1/n) · Σ (y_i − x_i)

where x and y denote the reference and predicted values, x̄ and ȳ the means of the reference and predicted values at the validation locations, n the number of validation points and cov and var the covariance and variance, respectively.
4. Discussion
Compared to studies investigating proximal sensing to estimate biophysical crop parameters, the UAV image variables were competitive. Several studies tested LiDAR systems on a tractor as an online proximal sensing system for adapting application on the go. They related either the vegetation volume or the reflection height (distance between crop surface and laser scanning unit), derived by triangulation or time-of-flight of the reflected laser beam, with biomass and LAI measurements. The studies reported R2 values for FBM between 0.77 and 0.99, for DBM between 0.72 and 0.99 and for LAI between 0.70 and 0.96 [22,54,55]. Sensing LAI was further investigated using a combined radiometer and ultrasound sensing system, yielding an R2 value of 0.84 [24], and a mobile LAI system based on canopy light transmittance, yielding R2 values between the sensor and LAI reference measurements ranging from 0.73 to 0.86 [20].
These sensing approaches were not developed to map entire fields but to calculate application rates online from sensor measurements. Similarly, UAV systems might be used this way by having a cable-tethered UAV flying in front of the application unit. This would enable greater influence on the field of view because the UAV system’s flight altitude would not be fixed, in contrast to the online sensors presented above [56]. Nevertheless, offline approaches, which are more related to UAVs, can represent an entire field as a parameter map. Satellite remote sensing products in particular have been related to various biophysical parameters due to their large spatial coverage and their potential for regional and global upscaling. Studies relating common SVIs used in satellite-based remote sensing, such as the normalized difference vegetation index (NDVI) or the enhanced vegetation index (EVI), with biomass and LAI reported relationships in a wide range from non-significant to highly significant. These results depended on the specific SVI, the spatial and spectral resolution of the satellite system and the growth stage [27,28,57,58]. Newer commercial satellites now make it possible to depict the earth’s surface at sub-meter resolution. WorldView-3 (DigitalGlobe Inc., Longmont, CO, USA), for example, delivers imagery with 0.30 m (panchromatic) and 1.1 m (RGB) ground resolution [58] with a revisit time of less than five days. These properties are quite competitive with airborne or even UAV platforms. However, cloud coverage may limit the use of satellites for monitoring during crop growth. In addition, high scene prices and/or special ordering rules, such as selling only large areas (10 × 10 km), may hinder the adoption of these data for precision agriculture.
In addition to the spectral information contained in the image tone, UAV systems can include information about the canopy volume of the wheat crops, estimated from point clouds calculated from the overlapping UAV imagery, which might be well suited to estimating biophysical crop parameters. Bendig et al. [29] showed highly linear relationships between crop height measurements and plant heights calculated from UAV surface models, with an R2 of 0.92 over multiple growth stages at a controlled test site with different cultivars and N-treatments. Grenzdörffer and Zacharias [59] reported an R2 between 0.60 and 0.72 within different cultivars. For comparison, when we pooled all the plant height measurements from the missions into one data set, we obtained a relationship with an R2 of 0.85 (Figure S2). Hunt et al. [60] related the NIR, green and blue signals from a commercial camera system recorded from an airborne platform with DBM and LAI measurements and reported significant relationships with R2 values of 0.65 and 0.85, respectively. Lelong et al. [35] modeled LAI with NDVI on the basis of UAV imagery; they reported an RMSE between 0.5 and 0.6 LAI shortly before and after flowering of durum and bread wheat.
In contrast to the biophysical UAV models, estimating the N status from UAV imagery was only moderately competitive with studies investigating proximal or hyperspectral sensing. Erdle et al. [23] investigated a number of proximal sensing systems used for online application, such as the GreenSeeker® and the CropCircle®, for their accuracy in Nt and NNI calculation. They reported R2 values between 0.41 and 0.83 for Nt and between 0.52 and 0.91 for NNI shortly before flowering. Pölönen et al. [31] investigated a hyperspectral UAV system and found an R2 value of 0.72 for Nt models. Chen et al. [61] investigated the relationship between different SVIs and nitrogen status calculated from hyperspectral field measurements. They reported errors between 0.13% and 0.37% for Nt and between 0.13 and 0.17 for NNI shortly before booting; the R2 values ranged from 0.28 to 0.92 for Nt and from 0.82 to 0.90 for NNI. However, they observed a larger range of Nt and NNI values compared to the range in this study. They concluded that a mechanistic model combining SVIs with a clear theoretical basis, i.e., SVIs related to Nt and DBM, should be more conclusive for the estimation of the NNI than the arbitrary use of SVIs, e.g., the NDVI. In the same way, UAV imagery can be considered a large improvement over airborne and satellite imagery because the combination of crop surface models with image tone variables provides a more direct approach to understanding and modeling the NNI spatial distribution within fields. In addition, UAV imagery also facilitates the estimation of Nt. Because the senescence of plant leaves propagates increasingly through the wheat canopy during the later growth stages, biophysical parameters become increasingly important for describing Nt in the wheat canopy, and the use of PHTUAV derived from crop surface models becomes more sensible for estimating Nt.
Online sensing delivers instant information about crop variability. This is not possible with UAVs today. However, online sensing has the strong limitation that it is ground and vehicle based; the field of view of the measurement is quite narrow and linear along the driving direction. In contrast, UAV images deliver spatial information about crop variation over the entire field without interpolation and with high spatial and temporal resolution. The offline approach of UAVs also allows for more insight than the “black box” online sensing approach.
According to Colomina and Molina [62], UAV missions should use an 80%–90% image overlap to compensate for the instability of the platform. However, we did not experience any problems with image alignment and dense cloud computation in Agisoft using the 60% overlap setting. Therefore, we decided to use this more economical approach. Using the 80% setting would have increased the mission time by about 50% and required twice the number of lithium polymer batteries (10,000 mAh/5S). In addition, memory storage for images and processing time for photogrammetry would have increased drastically. Bearing in mind that many fields in Eastern Germany are even larger than 50 ha, the 80% overlap approach is nearly unsuitable for farmers and even for agricultural services.
For adopting UAVs in precision agriculture, the processing and analysis of UAV images should be as easy as possible. However, a number of processing steps were included in this study that might seem excessive. According to Lelong et al. [35] and Rasmussen et al. [63], UAV imagery should be preprocessed to account for vignetting or BRDF effects. In this study, we used algorithms that are openly described in the UAV literature and relatively simple to implement. Applying these pre-processing algorithms to the images before photogrammetric processing had a visible effect on the final UAV scene (Figure S4). Therefore, we decided to use the pre-processed UAV scenes for modeling. Furthermore, we first extracted the plant pixels from the sample plots to relate only those to the crop parameters. Of course, plant separation as well as computation of the coverage would need some additional GIS analysis in order to work properly for mapping the entire image (field). It would therefore be preferable to skip this process altogether in favor of simpler image processing. To test this, we extracted all image variables without segmentation on the basis of all pixels within the sample area. The average correlation strength between the image variables and the crop parameters did not change drastically. Therefore, we decided to skip the plant separation step for the image variables and used it only for calculating the CVR variable (coverage).
It has to be acknowledged that the image variables were strongly interrelated with each other (Table 4). This might influence the linear regression due to multicollinearity. To avoid this problem, one can transform the variables into an alternative space in which the transformed variables become uncorrelated with each other; this can be achieved by using PCA. PCA is also a useful tool for exploring relationships between variables in a more holistic way, since most of the variance of the image variables is summarized within a few principal components (PC). To some extent, PCA is affected by strong non-linear behavior of the input variables. In our case, we did not see severe non-linearities among the image variables themselves that would justify a more sophisticated non-linear dimensionality reduction such as kernel PCA. To back up the linear dimension reduction approach, we used the Kaiser-Meyer-Olkin criterion and calculated the measure of sampling adequacy (MSA). The MSA value for the complete correlation matrix was 0.71, which is classified as middling (quite good), and no MSA value was lower than 0.6 [64]. Of course, image variables and crop variables tend to have non-linear relationships in later growth stages. To some extent, this was reduced by summarizing the image variables into components via PCA (Figure S5).
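The KMO/MSA computation can be sketched from its standard definition, comparing squared correlations with squared partial correlations obtained from the inverse of the correlation matrix; this is a generic sketch, not the exact software used in the study:

```python
import numpy as np

def kmo_msa(X):
    """Kaiser-Meyer-Olkin measure of sampling adequacy. Returns the
    overall KMO and the per-variable MSA values: ratios of summed squared
    correlations to summed squared correlations plus summed squared
    partial correlations."""
    R = np.corrcoef(X, rowvar=False)
    Rinv = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(Rinv), np.diag(Rinv)))
    P = -Rinv / d                       # partial correlation matrix
    np.fill_diagonal(P, 0.0)
    R0 = R - np.eye(len(R))             # correlations with zeroed diagonal
    msa = (R0 ** 2).sum(0) / ((R0 ** 2).sum(0) + (P ** 2).sum(0))
    kmo = (R0 ** 2).sum() / ((R0 ** 2).sum() + (P ** 2).sum())
    return kmo, msa
```

High KMO values indicate that the shared variance dominates over pairwise partial correlations, i.e., the variables are well suited to a common-factor (or PC) summary.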
5. Conclusions
With low-cost UAV imagery based on an RGB consumer-level camera, we were able to compute ultra-high resolution orthoimages and surface models from a cultivated 11 ha wheat field even by using only a 60% image overlap. We observed the development of spatial patterns in the wheat canopy from booting to grain filling with three flight missions and were able to qualitatively assess the changes in the wheat canopy down to the individual plant level.
Various biophysical parameters, i.e., leaf area index, fresh biomass, dry biomass and plant height, were highly correlated with the blue-channel-related ratios, the plant height calculated from the surface models and the plant coverage calculated from thresholding the UAV imagery. Both the image tone and the surface models derived from the UAV imagery were important for describing the spatial variability of the biophysical characteristics observed in the wheat canopies. Linear regression models for these parameters with principal components calculated from the image variables yielded R2 values between 0.70 and 0.97 for the entire data set and between 0.73 and 0.99 for the validation. Our study revealed that even under water-deficit conditions, UAV imagery can be used to estimate biophysical parameters properly. For example, deriving biomass or LAI maps from UAV imagery would help to manage irrigation systems more appropriately or improve plant disease forecasting.
Modeling the N content yielded only low R2 values, with the best model obtained at the flowering growth stage, with an R2 of 0.65. The red image tone variables were the most important predictors because the red signal is influenced by chlorophyll absorption, and chlorophyll can be related to the N content. During the later missions, the plant height and coverage derived from the UAV images became important because, with the further propagation of senescence, the chlorophyll content of the wheat leaves decreases. When calculating the N status with the NNI, we obtained a constant error of 0.10 to 0.11 NNI over all missions. However, the errors obtained here were comparable to other studies conducted with hyperspectral field measurements. The R2 values might be underestimated due to the low N variability caused by the soil-induced water deficit stress conditions. Therefore, the prospects of low-cost RGB UAV imagery for observing the spatial variability of N and NNI in wheat crops might be limited under these conditions, because the RGB image tone was only broadly related to changes in N during the flowering growth stage. However, the combination of morphological information calculated from surface models with the image tone is sensible for observing the NNI because the NNI depends on both the biophysical and the biochemical characteristics of the wheat crops. Future studies should focus on the relationship between UAV imagery and NNI under more variable N conditions using low-cost RGB cameras, and hyperspectral UAV sensing should be investigated for wheat crops under water-deficit conditions.