Multimodal attention-based deep learning for automatic modulation classification


Abstract

Wireless Internet of Things (IoT) is widely adopted for data collection and transmission in power systems, with the prerequisite that the base station of the wireless IoT be compatible with a variety of digital modulation types to meet the requirements of terminals operating in different modes. As a key technology in communication, Automatic Modulation Classification (AMC) manages resource shortages and improves spectrum utilization efficiency. For better accuracy and efficiency in the classification of signal modulation, deep learning (DL) is frequently exploited. In real cases, however, the signal-to-noise ratio (SNR) of signals received by the base station remains low due to complex electromagnetic interference from power equipment, which increases the difficulty of accurate AMC. Therefore, inspired by the attention mechanism and the multi-layer perceptron (MLP), AMC-MLP is introduced herein as a novel AMC method for low-SNR signals. Firstly, the sampled I/Q data are converted into a constellation diagram, a smoothed pseudo Wigner-Ville distribution (SPWVD), and a contour diagram of the spectral correlation function (SCF). Secondly, a convolutional auto-encoder (Conv-AE) is used to denoise the images and extract image feature vectors. Finally, an MLP is employed to fuse the multimodal features and classify the signals. The model exploits the characterization advantages of images in the different modalities and boosts classification accuracy. Results of simulations on the RadioML 2016.10A public dataset show that the model performs well and provides significantly better classification accuracy over the full SNR range than other recent deep-learning methods.
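The abstract outlines a three-stage pipeline: render the I/Q samples as three image modalities (constellation, SPWVD, SCF contour), denoise and encode each with a convolutional auto-encoder, then fuse the latent vectors with an MLP classifier. Below is a minimal PyTorch sketch of that fusion architecture. The class names, layer widths, latent dimension, and the 64×64 image resolution are illustrative assumptions rather than the paper's actual configuration, and the attention component mentioned in the title is omitted for brevity; num_classes=11 matches the 11 modulation types in RadioML 2016.10A.

```python
import torch
import torch.nn as nn

class ConvAE(nn.Module):
    """Convolutional auto-encoder; after denoising pre-training, the
    encoder half doubles as a per-modality feature extractor."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(              # 1x64x64 -> latent_dim
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(              # latent_dim -> 1x64x64
            nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

class AMCMLP(nn.Module):
    """Fuse constellation / SPWVD / SCF latent vectors with an MLP."""
    def __init__(self, latent_dim=128, num_classes=11):
        super().__init__()
        self.branches = nn.ModuleList([ConvAE(latent_dim) for _ in range(3)])
        self.fusion = nn.Sequential(
            nn.Linear(3 * latent_dim, 256), nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, constellation, spwvd, scf):
        feats = [branch.encoder(img)
                 for branch, img in zip(self.branches, (constellation, spwvd, scf))]
        return self.fusion(torch.cat(feats, dim=1))  # class logits

# Shape check with random stand-ins for the three image modalities.
imgs = [torch.randn(4, 1, 64, 64) for _ in range(3)]
print(AMCMLP()(*imgs).shape)  # torch.Size([4, 11])
```

Keeping one encoder per modality lets each branch specialize in its image type before the MLP models cross-modal interactions; that is one plausible reading of the fusion step, not a claim about the paper's exact layout.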


Related Articles

Automatic Modulation Classification Based on Deep Learning for Unmanned Aerial Vehicles

Deep learning has recently attracted much attention due to its excellent performance in processing audio, image, and video data. However, few studies are devoted to the field of automatic modulation classification (AMC). It is one of the most well-known research topics in communication signal recognition and remains challenging for traditional methods due to complex disturbance from other sourc...


Multimodal deep learning for solar radio burst classification

In this paper, multimodal deep learning for solar radio burst classification is proposed. We make the first attempt to build a multimodal learning network to learn the joint representation of the solar radio spectrums captured from different frequency channels, which are treated as different modalities. In order to learn the representation of each modality and the correlation and interaction betw...
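The entry above describes a common joint-representation pattern: one encoder per modality (here, per frequency channel) plus a shared layer that can model cross-modal correlation and interaction. The PyTorch sketch below is an illustrative reading of that pattern; the class name, dimensions, and two-modality setup are assumptions, not the cited paper's actual design.

```python
import torch
import torch.nn as nn

class JointRepresentation(nn.Module):
    """Per-modality encoders feeding a shared joint layer."""
    def __init__(self, in_dim=1024, modal_dim=128, joint_dim=64, n_modalities=2):
        super().__init__()
        self.encoders = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim, modal_dim), nn.ReLU())
             for _ in range(n_modalities)]
        )
        # The joint layer sees all modality features at once, so it can
        # capture correlations and interactions between modalities.
        self.joint = nn.Linear(n_modalities * modal_dim, joint_dim)

    def forward(self, modalities):
        feats = [enc(x) for enc, x in zip(self.encoders, modalities)]
        return self.joint(torch.cat(feats, dim=1))

# Two frequency channels treated as two modalities.
x1, x2 = torch.randn(4, 1024), torch.randn(4, 1024)
print(JointRepresentation()([x1, x2]).shape)  # torch.Size([4, 64])
```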


Deep Features for Multimodal Emotion Classification

Understanding human emotion when perceiving audio-visual content is an exciting and important research avenue. Thus, there have recently been emerging attempts to predict the emotion elicited by video clips or movies. While most existing approaches either focus on a single modality, i.e., only audio or visual data is exploited, or build on a multimodal scheme with late fusion, we propose a multim...


Multimodal Deep Learning Library

A neural network is a directed graph consisting of multiple layers of neurons, which are also referred to as units. In general there is no connection between units of the same layer, and there are only connections between adjacent layers. The first layer is the input and is referred to as the visible layer v. Above the visible layer there are multiple hidden layers {h1, h2, ..., hn}. The output o...
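A minimal PyTorch sketch of the layered topology this snippet describes: a visible layer v feeds a stack of hidden layers h1..hn, with connections only between adjacent layers. The layer widths and the helper name are illustrative assumptions.

```python
import torch.nn as nn

def layered_net(visible_dim, hidden_dims, output_dim):
    """Build a feedforward net where each layer connects only to the
    adjacent layers: v -> h1 -> ... -> hn -> output."""
    dims = [visible_dim, *hidden_dims, output_dim]
    layers = []
    for d_in, d_out in zip(dims[:-1], dims[1:]):
        layers += [nn.Linear(d_in, d_out), nn.Sigmoid()]
    return nn.Sequential(*layers[:-1])  # no activation on the output layer

# Visible layer of 784 units, hidden layers h1 and h2, 10 outputs.
net = layered_net(visible_dim=784, hidden_dims=[512, 256], output_dim=10)
```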


Multimodal Deep Learning

Deep networks have been successfully applied to unsupervised feature learning for single modalities (e.g., text, images or audio). In this work, we propose a novel application of deep networks to learn features over multiple modalities. We present a series of tasks for multimodal learning and show how to train deep networks that learn features to address these tasks. In particular, we demonstra...



Journal

Journal title: Frontiers in Energy Research

Year: 2022

ISSN: 2296-598X

DOI: https://doi.org/10.3389/fenrg.2022.1041862