Recognition and Classification of Maize Drought on UAV Hyperspectral Images
author: Gavin
2022-01-06

Traditional crop drought identification methods mostly use satellite multispectral remote sensing images to calculate vegetation indices and infer drought conditions. However, limited by weather, revisit time, low spatial resolution, and other factors, satellite images often cannot satisfy requirements for timeliness and accuracy. With the development of unmanned aerial vehicle (UAV) technology, it has become possible to obtain ultra-high-resolution remote sensing images conveniently, flexibly, and efficiently, which in recent years has offered new approaches to many agricultural problems. Meanwhile, the development of deep neural networks, and in particular the proposal of fully convolutional networks, has further improved the accuracy of semantic segmentation on remote sensing images.
Principle
Agricultural drought occurs when the soil moisture content at the plant roots falls and the plants cannot absorb enough water to grow. Severe water loss reduces crop transpiration; the stomata on the leaves shrink or even close, and the crops wither. Both surface temperature and vegetation indices can therefore be used to monitor crop drought. Vegetation information is mainly reflected in the red (R) and near-infrared (NIR) bands, and the normalized difference vegetation index (NDVI) calculated from these two bands effectively reflects the growth status of vegetation.
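As a minimal illustration, NDVI can be computed per pixel from the red and NIR band arrays; the function name and the epsilon guard below are our own, not from the original study:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    # a tiny epsilon guards against division by zero on dark pixels
    return (nir - red) / (nir + red + 1e-12)

# healthy vegetation reflects strongly in NIR, pushing NDVI toward 1
print(ndvi(np.array([0.80]), np.array([0.10])))
```

Dense vegetation thus yields NDVI well above zero, while bare soil and stressed crops fall toward zero, which is what makes the index usable as a drought label.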
Solution
The data in this study are four-channel (RGB + NIR) multispectral images captured by a DJI Matrice 600 Pro (M600-PRO) drone equipped with an ATH9010 hyperspectral imager produced by Optosky Photonics. NDVI values calculated from the red (R) and near-infrared (NIR) bands are used to generate labels, and a deep-learning semantic segmentation network is trained on the drone RGB image samples, so that the trained network can recognize and classify maize drought directly from RGB images taken by ordinary consumer-grade drones.

The overall process of the solution is as follows:
(1) Selection of original data: this study uses maize image data from villages and towns affected by drought in early autumn.
(2) Preprocess the data (screening and stitching); split it into training, validation, and test sets; crop the training and validation samples, and apply edge mirroring and overlapping cropping to the test samples.
(3) Manually delineate non-maize areas such as roads and houses as an "other" class; compute the NDVI of the maize area from the red (R) and near-infrared (NIR) bands and smooth it; combined with expert assessment of the maize drought level, generate disaster-level labels.
(4) Build a network model.
(5) Feed the training and validation samples into the network for training, with reasonable training parameters.
(6) Test the trained network on the test set, and evaluate its accuracy with metrics such as the F1-score and the Jaccard coefficient.
(7) Optimize the network structure and parameters, repeating steps (4) to (6) as comparison experiments until the test-set scores meet the requirements and no longer increase significantly, yielding the optimal network model.
(8) Visualize test results.
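The evaluation in step (6) can be sketched as follows: per-class F1-score and Jaccard coefficient computed from pixel counts. The function name and the flat-array inputs are illustrative, not from the original study:

```python
import numpy as np

def per_class_scores(pred: np.ndarray, label: np.ndarray, num_classes: int):
    """Per-class (F1, Jaccard) for a segmentation map, from pixel counts."""
    scores = []
    for c in range(num_classes):
        tp = np.sum((pred == c) & (label == c))  # true positives for class c
        fp = np.sum((pred == c) & (label != c))  # false positives
        fn = np.sum((pred != c) & (label == c))  # false negatives
        f1 = 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0
        jaccard = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
        scores.append((f1, jaccard))
    return scores
```

Note that F1 and Jaccard are monotonically related (F1 = 2J / (1 + J)), so they rank models identically per class; reporting both mainly aids comparison with other papers.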
Test result
No. | NDVI | Drought level
0 | 0.5–1.0 | Unaffected
1 | 0.42–0.5 | Mild drought
2 | 0.36–0.42 | Moderate drought
3 | 0–0.36 | Severe drought
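The thresholds in the table map directly to class labels. A sketch of that mapping (function name and output dtype are our own choices):

```python
import numpy as np

def drought_class(ndvi: np.ndarray) -> np.ndarray:
    """Map NDVI values to drought classes per the table above."""
    cls = np.full(ndvi.shape, 3, dtype=np.uint8)  # default: severe drought (0-0.36)
    cls[ndvi >= 0.36] = 2   # moderate drought
    cls[ndvi >= 0.42] = 1   # mild drought
    cls[ndvi >= 0.50] = 0   # unaffected
    return cls
```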

[Figure: classification map legend — Unaffected, Mild drought, Moderate drought, Severe drought, Other]
Before slicing the test image, 50 pixels at each edge are mirror-padded. Inference uses overlapping sliding-window slices as input: adjacent windows overlap by 100 pixels, and each 512×512 crop is fed to the network. The network output is likewise 512×512; the 50-pixel border is discarded, leaving a 412×412 central region, and these central regions are stitched together into the final result.
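Under the sizes stated above (512-pixel tiles, 50-pixel mirrored margin, 412-pixel stride so adjacent windows overlap by 100 pixels), the inference loop can be sketched as follows; `predict_tiled` and the `model` callable are stand-ins, not the study's actual code:

```python
import numpy as np

TILE, MARGIN = 512, 50
STRIDE = TILE - 2 * MARGIN  # 412: adjacent 512-px windows overlap by 100 px

def predict_tiled(image: np.ndarray, model) -> np.ndarray:
    """image: (H, W, C) array; model: callable (TILE, TILE, C) -> (TILE, TILE) class map."""
    h, w = image.shape[:2]
    # mirror-pad: 50 px on every edge, plus extra on the far edges so the
    # 412-px stride divides the padded extent evenly
    pad_h, pad_w = (-h) % STRIDE, (-w) % STRIDE
    padded = np.pad(image,
                    ((MARGIN, MARGIN + pad_h), (MARGIN, MARGIN + pad_w), (0, 0)),
                    mode="reflect")
    out = np.zeros(padded.shape[:2], dtype=np.uint8)
    for y in range(0, h + pad_h, STRIDE):
        for x in range(0, w + pad_w, STRIDE):
            pred = model(padded[y:y + TILE, x:x + TILE])
            # keep only the central 412x412 region of each prediction
            out[y + MARGIN:y + TILE - MARGIN,
                x + MARGIN:x + TILE - MARGIN] = pred[MARGIN:-MARGIN, MARGIN:-MARGIN]
    return out[MARGIN:MARGIN + h, MARGIN:MARGIN + w]
```

Discarding the border of each prediction avoids the edge artifacts that convolutional networks produce near tile boundaries, which is why the windows must overlap in the first place.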


Conclusion
In this article, we focus on using RGB images acquired by drones to identify maize drought and classify its severity through deep-learning semantic segmentation. First, the four-channel UAV data are analyzed and, combined with expert interpretation, a maize disaster-level distribution map is generated. Second, using these labels with a three-channel-input semantic segmentation network, deep-learning disaster classification is realized, demonstrating the feasibility of deep learning in disaster analysis. Finally, operations such as modifying the loss function and optimizing the U-Net structure improve the performance of disaster-area recognition.