Block-term tensor neural networks
Authors
Abstract
Similar resources
Learning Compact Recurrent Neural Networks with Block-Term Tensor Decomposition
Recurrent Neural Networks (RNNs) are powerful sequence modeling tools. However, when dealing with high-dimensional inputs, the training of RNNs becomes computationally expensive due to the large number of model parameters. This hinders RNNs from solving many important computer vision tasks, such as Action Recognition in Videos and Image Captioning. To overcome this problem, we propose a compact a...
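The compact form referred to above is a block-term (sum-of-Tucker) factorization of the network's weight tensors. As a rough illustration only, and not the paper's exact construction, the NumPy sketch below rebuilds a small weight matrix from a few block terms and compares parameter counts; all sizes (I1, I2, J1, J2, R, N) are made up for the example.

```python
import numpy as np

# Illustrative sizes: a 64 x 16 weight matrix reshaped into an order-4 tensor
# of shape (I1, I2, J1, J2) and written as a sum of N rank-R block terms.
I1, I2 = 8, 8
J1, J2 = 4, 4
R, N = 2, 3

rng = np.random.default_rng(0)

def block_term_weight(cores, factors):
    """Reconstruct W[i1, i2, j1, j2] as a sum of block terms: each term is a
    small Tucker core contracted with one factor matrix per mode."""
    W = np.zeros((I1, I2, J1, J2))
    for G, (A1, A2, B1, B2) in zip(cores, factors):
        W += np.einsum('abcd,ia,jb,kc,ld->ijkl', G, A1, A2, B1, B2)
    return W

cores = [rng.standard_normal((R, R, R, R)) for _ in range(N)]
factors = [tuple(rng.standard_normal((d, R)) for d in (I1, I2, J1, J2))
           for _ in range(N)]

W = block_term_weight(cores, factors).reshape(I1 * I2, J1 * J2)  # ordinary 64 x 16 weight

dense_params = I1 * I2 * J1 * J2                     # 1024 entries in the dense matrix
bt_params = N * (R**4 + R * (I1 + I2 + J1 + J2))     # 192 entries in block-term form
print(W.shape, dense_params, bt_params)
```

Even at these toy sizes the block-term form stores roughly a fifth of the dense parameters, which is the kind of saving that makes high-dimensional RNN inputs tractable.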
Block-Decoupling Multivariate Polynomials Using the Tensor Block-Term Decomposition
We present a tensor-based method to decompose a given set of multivariate functions into linear combinations of a set of multivariate functions of linear forms of the input variables. The method proceeds by forming a three-way array (tensor) by stacking Jacobian matrix evaluations of the function behind each other. It is shown that a block-term decomposition of this tensor provides the necessary...
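A rough NumPy sketch of the stacking step described above, under assumed details: a toy function f built from linear forms, central-difference Jacobians, and random sampling points. The block-term decomposition of the resulting tensor is only indicated, not implemented.

```python
import numpy as np

# Toy function f: R^3 -> R^2 whose components depend on linear forms of the
# inputs -- the structure the decomposition is meant to recover.
def f(x):
    u = x[0] + 2.0 * x[1]              # first linear form
    v = x[1] - x[2]                    # second linear form
    return np.array([np.tanh(u) + v**2, u * v])

def numerical_jacobian(func, x, eps=1e-6):
    """Central-difference Jacobian of func at x, shape (n_outputs, n_inputs)."""
    fx = func(x)
    J = np.zeros((fx.size, x.size))
    for k in range(x.size):
        e = np.zeros(x.size); e[k] = eps
        J[:, k] = (func(x + e) - func(x - e)) / (2.0 * eps)
    return J

rng = np.random.default_rng(1)
points = rng.standard_normal((20, 3))                 # sampling points

# Stack one Jacobian per sampling point behind each other: a 2 x 3 x 20 tensor.
T = np.stack([numerical_jacobian(f, x) for x in points], axis=2)
print(T.shape)                                        # (2, 3, 20)
# A block-term decomposition of T would then expose the linear forms u, v and
# the low-dimensional inner functions acting on them.
```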
BT-Nets: Simplifying Deep Neural Networks via Block Term Decomposition
Recently, deep neural networks (DNNs) have been regarded as the state-of-the-art classification methods in a wide range of applications, especially in image classification. Despite this success, the huge number of parameters hinders their deployment in settings with limited computing resources. Researchers exploit the redundancy in the weights of DNNs and attempt to find how fewer parameters can ...
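To make the saving concrete, here is a minimal NumPy sketch, with made-up layer sizes, of a fully connected layer whose weight is kept only in block-term form, so the dense matrix is never materialized; it follows the general block-term idea rather than the paper's exact layer definition.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative sizes: a 64 -> 16 fully connected layer, factored as 8*8 -> 4*4.
I1, I2, J1, J2, R, N = 8, 8, 4, 4, 2, 3

cores = [rng.standard_normal((R, R, R, R)) for _ in range(N)]
factors = [tuple(rng.standard_normal((d, R)) for d in (I1, I2, J1, J2))
           for _ in range(N)]

def bt_linear(x):
    """y = W x with W kept in block-term format: each block term is contracted
    directly with the reshaped input, so the dense (64, 16) weight never exists."""
    xt = x.reshape(-1, I1, I2)                        # (batch, I1, I2)
    y = np.zeros((xt.shape[0], J1, J2))
    for G, (A1, A2, B1, B2) in zip(cores, factors):
        y += np.einsum('nij,ia,jb,abcd,kc,ld->nkl', xt, A1, A2, G, B1, B2)
    return y.reshape(-1, J1 * J2)                     # (batch, 16)

x = rng.standard_normal((5, I1 * I2))
print(bt_linear(x).shape)                             # (5, 16)
```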
Efficient Short-Term Electricity Load Forecasting Using Recurrent Neural Networks
Short-term load forecasting (STLF) plays an important role in the economic and reliable operation of power systems. Electric load demand has a complex profile with many multivariable and nonlinear dependencies. In this study, a recurrent neural network (RNN) architecture is presented for STLF. The proposed model is capable of forecasting the next 24-hour load profile. The main feature of this network is ...
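A minimal, untrained NumPy sketch of the general idea: an Elman-style recurrent pass over a window of past loads that emits a 24-hour profile in one shot. The layer sizes, feature count, and one-week input window are assumptions for illustration, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed shapes: hourly features over one week of history -> next 24 hours.
n_features, n_hidden, horizon, seq_len = 4, 32, 24, 168

Wx = rng.standard_normal((n_hidden, n_features)) * 0.1
Wh = rng.standard_normal((n_hidden, n_hidden)) * 0.1
Wo = rng.standard_normal((horizon, n_hidden)) * 0.1
bh = np.zeros(n_hidden)
bo = np.zeros(horizon)

def forecast_24h(history):
    """Run an Elman-style recurrence over the input window; the final hidden
    state is mapped to a 24-value load profile."""
    h = np.zeros(n_hidden)
    for x_t in history:                   # history: (seq_len, n_features)
        h = np.tanh(Wx @ x_t + Wh @ h + bh)
    return Wo @ h + bo                    # (24,) forecast

history = rng.standard_normal((seq_len, n_features))
print(forecast_24h(history).shape)        # (24,)
```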
Block-based neural networks
This paper presents a novel block-based neural network (BBNN) model and the optimization of its structure and weights based on a genetic algorithm. The architecture of the BBNN consists of a 2D array of fundamental blocks with four variable input/output nodes and connection weights. Each block can have one of four different internal configurations depending on the structure settings. The BBNN m...
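As an illustration of the kind of encoding a genetic algorithm could act on, here is a small NumPy sketch with assumed details (block-array size, four configuration codes, weight-bank size, mutation rates); it is not the paper's actual genome encoding or operator set.

```python
import numpy as np

rng = np.random.default_rng(4)

# A toy BBNN-style genome: for every block in a 2x2 array, one of four internal
# configuration codes plus a fixed-size bank of connection weights.
ROWS, COLS, MAX_WEIGHTS = 2, 2, 4

def random_genome():
    return {
        "config": rng.integers(0, 4, size=(ROWS, COLS)),               # structure genes
        "weights": rng.uniform(-1, 1, size=(ROWS, COLS, MAX_WEIGHTS)), # weight genes
    }

def mutate(genome, p_struct=0.1, p_weight=0.2, sigma=0.1):
    """GA mutation: occasionally flip a block's configuration code, and jitter
    some of its weights with Gaussian noise."""
    child = {"config": genome["config"].copy(),
             "weights": genome["weights"].copy()}
    flip = rng.random((ROWS, COLS)) < p_struct
    child["config"][flip] = rng.integers(0, 4, size=flip.sum())
    jitter = rng.random(child["weights"].shape) < p_weight
    child["weights"][jitter] += sigma * rng.standard_normal(jitter.sum())
    return child

parent = random_genome()
child = mutate(parent)
print(child["config"])
```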
Journal
Journal title: Neural Networks
Year: 2020
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2020.05.034