An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks
Authors
Abstract
As a key step to endow a neural network with nonlinear factors, the activation function is crucial to the performance of the network. This paper proposes an Efficient Asymmetric Nonlinear Activation Function (EANAF) for deep neural networks. Compared with existing activation functions, the proposed EANAF requires less computational effort, and it is self-regularized, asymmetric, and non-monotonic. These desired characteristics facilitate the outstanding performance of EANAF. To demonstrate the effectiveness of this function in the field of object detection, EANAF is compared with several state-of-the-art activation functions on typical backbone networks such as ResNet and DSPDarkNet. The experimental results demonstrate the superior performance of the proposed EANAF.
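The abstract does not reproduce EANAF's closed form; as a hedged sketch of the properties it names (cheap to evaluate, self-gated, asymmetric, non-monotonic), the PyTorch module below uses the well-known Swish/SiLU gate x * sigmoid(x) as a stand-in, not the authors' actual formula.

import torch
import torch.nn as nn

class SelfGatedActivation(nn.Module):
    """Illustrative self-gated activation f(x) = x * sigmoid(x) (Swish/SiLU).

    Like the EANAF described above, it is asymmetric and non-monotonic
    (it dips below zero near x ~ -1.28 before flattening out) and costs
    only a couple of elementwise ops. It is a stand-in, NOT the paper's
    EANAF formula.
    """
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.sigmoid(x)

# Usage: drop the activation into a small convolutional block.
block = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), SelfGatedActivation())
print(block(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 16, 32, 32])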
Similar Resources
Reluplex: An Efficient SMT Solver for Verifying Deep Neural Networks
Deep neural networks have emerged as a widely used and effective means for tackling complex, real-world problems. However, a major obstacle in applying them to safety-critical systems is the great difficulty in providing formal guarantees about their behavior. We present a novel, scalable, and efficient technique for verifying properties of deep neural networks (or providing counter-examples). ...
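Reluplex itself extends the simplex method inside an SMT solver and is sound and complete; the sketch below is a much simpler hedged illustration of the kind of property being checked, using interval bound propagation (sound but incomplete, and not Reluplex) to certify an output bound for a toy ReLU network.

import numpy as np

def interval_affine(lo, hi, W, b):
    """Propagate an input box [lo, hi] through x -> W @ x + b."""
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

def output_bounds(layers, lo, hi):
    """Sound (possibly loose) output bounds for a feed-forward ReLU net."""
    for i, (W, b) in enumerate(layers):
        lo, hi = interval_affine(lo, hi, W, b)
        if i < len(layers) - 1:  # ReLU on hidden layers only
            lo, hi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)
    return lo, hi

# Toy 2-2-1 network; property: "output <= 4 on the input box [-1, 1]^2".
layers = [(np.array([[1.0, -1.0], [0.5, 0.5]]), np.zeros(2)),
          (np.array([[1.0, 1.0]]), np.zeros(1))]
lo, hi = output_bounds(layers, np.array([-1.0, -1.0]), np.array([1.0, 1.0]))
print(hi)                   # certified upper bound on the output ([3.])
print(bool(hi[0] <= 4.0))   # True: the property holds on the whole box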
Nonparametric regression using deep neural networks with ReLU activation function
Consider the multivariate nonparametric regression model. It is shown that estimators based on sparsely connected deep neural networks with ReLU activation function and properly chosen network architecture achieve the minimax rates of convergence (up to log n-factors) under a general composition assumption on the regression function. The framework includes many well-studied structural constrain...
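As a hedged, minimal illustration of this setting (not the paper's estimator or its rates), the sketch below draws data from a composition-structured regression function g(h(x)) plus noise and fits a small ReLU network to it; the particular g and h are arbitrary choices.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Composition-structured truth f = g(h(x)): h maps R^2 -> R, g maps R -> R,
# so each piece depends on few variables (an assumed toy instance).
h = lambda x: x[:, 0] * x[:, 1]
g = lambda t: torch.sin(3.0 * t)

X = torch.rand(2000, 2) * 2 - 1
y = g(h(X)) + 0.1 * torch.randn(2000)

net = nn.Sequential(nn.Linear(2, 32), nn.ReLU(),
                    nn.Linear(32, 32), nn.ReLU(),
                    nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for step in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(X).squeeze(-1), y)
    loss.backward()
    opt.step()
print(f"final MSE: {loss.item():.4f}")  # should fall toward the ~0.01 noise floor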
Efficient Model Averaging for Deep Neural Networks
Large neural networks trained on small datasets are increasingly prone to overfitting. Traditional machine learning methods can reduce overfitting by employing bagging or boosting to train several diverse models. For large neural networks, however, this is prohibitively expensive. To address this issue, we propose a method to leverage the benefits of ensembles without explicitly training several...
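The snippet cuts off before naming the method; purely as a hedged illustration of ensemble-style averaging from a single trained network, the sketch below runs Monte Carlo dropout at test time, averaging predictions over several sampled dropout masks. This is a stand-in technique, not necessarily the one proposed in the paper.

import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(),
                    nn.Dropout(p=0.5), nn.Linear(64, 2))

def mc_dropout_predict(model, x, passes=20):
    """Average predictions over several dropout masks (MC dropout).

    Keeping dropout active at inference samples a different thinned
    sub-network on each pass, approximating an ensemble average with a
    single set of trained weights.
    """
    model.train()  # keep dropout active at inference
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(passes)])
    return probs.mean(dim=0), probs.std(dim=0)

x = torch.randn(4, 10)
mean, std = mc_dropout_predict(net, x)
print(mean.shape, std.shape)  # torch.Size([4, 2]) twice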
Activation Ensembles for Deep Neural Networks
Many activation functions have been proposed in the past, but selecting an adequate one requires trial and error. We propose a new methodology of designing activation functions within a neural network at each layer. We call this technique an “activation ensemble” because it allows the use of multiple activation functions at each layer. This is done by introducing additional variables, α, at eac...
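As a hedged reading of this description, the sketch below implements one plausible form of such an ensemble: a layer holding learnable weights alpha that blend several candidate activations, with a softmax keeping the mixture normalized. The candidate set and the normalization are assumptions, not the paper's exact construction.

import torch
import torch.nn as nn

class ActivationEnsemble(nn.Module):
    """Hedged sketch of a per-layer "activation ensemble".

    Learnable weights alpha (softmax-normalized so they stay on the
    simplex) blend several candidate activations, letting training pick
    the mixture for this layer. Candidates and normalization are assumed.
    """
    def __init__(self):
        super().__init__()
        self.fns = [torch.relu, torch.tanh, torch.sigmoid]
        self.alpha = nn.Parameter(torch.zeros(len(self.fns)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.softmax(self.alpha, dim=0)
        return sum(w[i] * f(x) for i, f in enumerate(self.fns))

layer = nn.Sequential(nn.Linear(8, 8), ActivationEnsemble())
print(layer(torch.randn(2, 8)).shape)  # torch.Size([2, 8])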
Why Deep Neural Networks for Function Approximation?
Recently there has been much interest in understanding why deep neural networks are preferred to shallow networks. We show that, for a large class of piecewise smooth functions, the number of neurons needed by a shallow network to approximate a function is exponentially larger than the corresponding number of neurons needed by a deep network for a given degree of function approximation. First, ...
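A standard concrete instance of this depth separation (offered here as a hedged illustration, not the paper's own construction) is the sawtooth function: composing a three-ReLU "hat" map k times produces 2**k linear pieces using only 3k units, while a single hidden layer of m ReLU units realizes at most m + 1 pieces and so needs exponentially many units to match it.

import numpy as np

relu = lambda x: np.maximum(x, 0.0)

def hat(x):
    """Triangle map on [0, 1] built from three ReLU units:
    g(x) = 2x on [0, 1/2], 2(1 - x) on [1/2, 1], 0 elsewhere."""
    return 2*relu(x) - 4*relu(x - 0.5) + 2*relu(x - 1.0)

def sawtooth(x, k):
    """k-fold composition g o ... o g: a depth-k network with 3k ReLU
    units whose graph has 2**k linear pieces on [0, 1]."""
    for _ in range(k):
        x = hat(x)
    return x

x = np.linspace(0.0, 1.0, 9)
print(sawtooth(x, 3))  # alternates 0, 1, 0, 1, ... at multiples of 1/8
# A one-hidden-layer ReLU net with m units has at most m + 1 linear
# pieces, so matching the 2**k pieces above needs ~2**k units:
# depth buys exponentially more oscillation per neuron.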
Journal
Journal title: Symmetry
Year: 2022
ISSN: 0865-4824, 2226-1877
DOI: https://doi.org/10.3390/sym14051027