Insect detection from imagery using YOLOv3-based adaptive feature fusion convolution network

Authors

Abstract

Context. Insects are a major threat to crop production: they can infect, damage, and reduce agricultural yields, so accurate and fast detection of insects helps insect control. From a computer-algorithm point of view, insect detection from imagery is a tiny-object detection problem. Handling such objects in large datasets is challenging owing to the small resolution of the insect within an image and to other nuisances such as occlusion, noise, and a lack of distinguishing features. Aims. Our aim was to achieve a high-performance insect detector using an enhanced artificial intelligence (machine learning) technique. Methods. We used a YOLOv3 network-based framework, which is a high-performing and computationally efficient detector, and further improved its original feature pyramid network by integrating an adaptive feature fusion module. For training the network, we first applied data augmentation techniques to regularise the dataset, then trained it with optimised hyper-parameters. Finally, we tested the proposed detector on a subset of the multi-class pest dataset Pest24, which contains 25 878 images. Key results. The detector achieved an accuracy of 72.10%, superior to existing techniques, while achieving a detection rate of 63.8 images per second. Conclusions. We compared the proposed detector with several models regarding accuracy and processing speed; it showed superior performance in terms of both accuracy and computational speed. Implications. This work demonstrates that deep neural networks can provide a foundation for developing real-time insect detection systems for better control of insect damage.
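The abstract does not give implementation details. As a rough illustration of what an adaptive feature-fusion step on a YOLOv3-style feature pyramid can look like, the PyTorch sketch below re-weights three pyramid levels per pixel with learned softmax weights, in the spirit of adaptively weighted fusion. The module name, channel widths, and level sizes are our own assumptions, not the authors' code.

```python
# Illustrative sketch only: an adaptively weighted fusion of three FPN levels.
# Channel widths, level sizes and names are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveFusion(nn.Module):
    def __init__(self, channels=256, num_levels=3):
        super().__init__()
        # 1x1 convolutions project each pyramid level before fusion.
        self.project = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=1) for _ in range(num_levels)
        )
        # Predict one spatial weight map per level, normalised with softmax.
        self.weight = nn.Conv2d(channels * num_levels, num_levels, kernel_size=1)

    def forward(self, levels):
        # Upsample every level to the spatial size of the finest level.
        target = levels[0].shape[-2:]
        resized = [
            F.interpolate(proj(x), size=target, mode="nearest")
            for proj, x in zip(self.project, levels)
        ]
        weights = torch.softmax(self.weight(torch.cat(resized, dim=1)), dim=1)
        # Per-pixel weighted sum: each location learns its own mix of scales.
        return sum(weights[:, i:i + 1] * f for i, f in enumerate(resized))

# Dummy pyramid levels at strides 8, 16 and 32 of a 416x416 input.
levels = [torch.randn(1, 256, s, s) for s in (52, 26, 13)]
print(AdaptiveFusion()(levels).shape)  # torch.Size([1, 256, 52, 52])
```

The per-pixel weighting is what lets small objects such as insects draw more heavily on the fine-resolution level, rather than averaging all scales uniformly.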


Similar articles


Adaptive Decision Fusion in Detection Networks

In a detection network, the final decision is made by fusing the decisions from local detectors. The objective of that decision is to minimize the final error probability. To implement an optimal fusion rule, the performance of each detector, i.e. its probability of false alarm and its probability of missed detection, as well as the a priori probabilities of the hypotheses, must be known. How...
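The snippet describes the classical likelihood-ratio fusion of binary local decisions. As a hedged illustration of how the false-alarm and missed-detection probabilities and the priors enter the fused decision, a minimal Python sketch (variable names are our own) is:

```python
# Sketch of a likelihood-ratio fusion of local binary decisions
# (in the spirit of the Chair-Varshney rule); illustrative only.
import math

def fuse_decisions(decisions, p_fa, p_md, prior_h1=0.5):
    """decisions: list of 0/1 local decisions; p_fa/p_md: per-detector
    false-alarm and missed-detection probabilities."""
    score = math.log(prior_h1 / (1.0 - prior_h1))
    for u, pf, pm in zip(decisions, p_fa, p_md):
        if u == 1:
            score += math.log((1.0 - pm) / pf)   # detector voted for H1
        else:
            score += math.log(pm / (1.0 - pf))   # detector voted for H0
    return 1 if score > 0 else 0

# Example: three detectors of different reliability, two voting H1.
print(fuse_decisions([1, 1, 0], p_fa=[0.05, 0.10, 0.20], p_md=[0.10, 0.15, 0.30]))
```

Reliable detectors (small p_fa and p_md) contribute larger log-likelihood terms, so their votes dominate the fused decision.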


Satellite Imagery Classification Based on Deep Convolution Network

Satellite imagery classification is a challenging problem with many practical applications. In this paper, we designed a deep convolution neural network (DCNN) to classify the satellite imagery. The contributions of this paper are twofold. First, to cope with the large-scale variance in the satellite image, we introduced the inception module, which has multiple filters with different sizes at t...
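As a hedged illustration of the multi-filter-size idea the snippet mentions, a generic inception-style block (not the authors' exact DCNN configuration) can be sketched as follows:

```python
# Generic inception-style block with parallel filter sizes; a sketch of the
# idea described in the snippet, not the authors' network.
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, in_ch, branch_ch=32):
        super().__init__()
        c = branch_ch
        self.b1 = nn.Conv2d(in_ch, c, kernel_size=1)
        self.b3 = nn.Sequential(nn.Conv2d(in_ch, c, kernel_size=1),
                                nn.Conv2d(c, c, kernel_size=3, padding=1))
        self.b5 = nn.Sequential(nn.Conv2d(in_ch, c, kernel_size=1),
                                nn.Conv2d(c, c, kernel_size=5, padding=2))
        self.pool = nn.Sequential(nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
                                  nn.Conv2d(in_ch, c, kernel_size=1))

    def forward(self, x):
        # Each branch sees the same input at a different receptive-field size;
        # the outputs are concatenated along the channel axis.
        return torch.cat([self.b1(x), self.b3(x), self.b5(x), self.pool(x)], dim=1)

print(InceptionBlock(3)(torch.randn(1, 3, 64, 64)).shape)  # [1, 128, 64, 64]
```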


Robust Fusion of Irregularly Sampled Data Using Adaptive Normalized Convolution

We present a novel algorithm for image fusion from irregularly sampled data. The method is based on the framework of normalized convolution (NC), in which the local signal is approximated through a projection onto a subspace. The use of polynomial basis functions in this paper makes NC equivalent to a local Taylor series expansion. Unlike the traditional framework, however, the window function ...
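As a minimal illustration of the idea, the sketch below implements zeroth-order normalized convolution: a weighted average driven by a certainty map and a fixed Gaussian applicability window. The paper itself uses polynomial basis functions and an adaptive window, which this simplified sketch does not reproduce.

```python
# Zeroth-order normalized convolution: reconstruct an irregularly sampled
# signal from its certainty map. Simplified illustration, not the paper's method.
import numpy as np
from scipy.ndimage import gaussian_filter

def normalized_convolution(signal, certainty, sigma=2.0):
    # certainty is 1 where a sample exists and 0 where data is missing.
    num = gaussian_filter(signal * certainty, sigma)
    den = gaussian_filter(certainty, sigma)
    return num / np.maximum(den, 1e-8)

# Example: reconstruct an image from 20% of its pixels.
rng = np.random.default_rng(0)
image = rng.random((64, 64))
mask = (rng.random((64, 64)) < 0.2).astype(float)
print(normalized_convolution(image * mask, mask).shape)  # (64, 64)
```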


Overlap-based feature weighting: The feature extraction of Hyperspectral remote sensing imagery

Hyperspectral sensors provide a large number of spectral bands. This massive and complex data structure of hyperspectral images presents a challenge to traditional data processing techniques. Therefore, reducing the dimensionality of hyperspectral images without losing important information is a very important issue for the remote sensing community. We propose to use overlap-based feature weigh...



Journal

Journal title: Crop & Pasture Science

Year: 2022

ISSN: 1836-5795, 1836-0947

DOI: https://doi.org/10.1071/cp21710