Search results for: hidden training
Number of results: 378,572
Evolving Cascade Neural Networks (ECNNs) and a new training algorithm capable of selecting informative features are described. The ECNN initially learns with one input node and then evolves by adding new inputs as well as new hidden neurons. The resultant ECNN has a near minimal number of hidden neurons and inputs. The algorithm is successfully used for training ECNN to recognise artefacts in s...
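The growth loop the abstract describes can be sketched roughly as follows: candidate hidden neurons are tried one at a time and kept only when they reduce the error. This is a minimal illustration of cascade-style growth, not the paper's actual ECNN algorithm; the data, the acceptance criterion, and all names here are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative; the growth loop is the point, not the task).
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2

def fit_output(H, y):
    # Solve output weights by least squares given hidden activations H (plus bias).
    return np.linalg.lstsq(np.c_[H, np.ones(len(H))], y, rcond=None)[0]

# Grow the hidden layer one candidate neuron at a time, keeping a neuron
# only if it reduces the training error (a stand-in for the ECNN criterion
# of keeping informative neurons/inputs).
hidden_w = []
H = np.empty((len(X), 0))
err = np.mean(y ** 2)                  # error of the empty model
for _ in range(30):
    w = rng.normal(size=5)             # random candidate hidden neuron
    H_try = np.c_[H, np.tanh(X @ w)]
    beta = fit_output(H_try, y)
    resid = y - np.c_[H_try, np.ones(len(X))] @ beta
    new_err = np.mean(resid ** 2)
    if new_err < err:                  # keep only informative neurons
        hidden_w.append(w)
        H, err = H_try, new_err
```

A real ECNN would also evolve the set of input nodes and use a validation-based stopping rule rather than training error alone.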
Abstract Erosion and sedimentation are among the most complicated problems in hydrodynamics and are very important in water-related projects in arid and semi-arid basins. For this reason, suitable methods for reliably estimating the suspended sediment load of rivers are very valuable. Solving the hydrodynamic equations governing these phenomena and access to a mathematical-conceptual mode...
This article presents a method for training Dynamic Factor Graphs (DFG) with continuous latent state variables. A DFG includes factors modeling joint probabilities between hidden and observed variables, and factors modeling dynamical constraints on hidden variables. The DFG assigns a scalar energy to each configuration of hidden and observed variables. A gradient-based inference procedure finds...
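The energy formulation in this abstract can be illustrated with linear factors: an observation factor tying hidden to observed variables and a dynamics factor constraining consecutive hidden states, with gradient descent on the hidden states as the inference procedure. This is a minimal sketch under assumed linear factors, not the paper's DFG; the matrices `A` and `C` and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
T, dz, dx = 20, 2, 3
A = np.array([[0.9, -0.2], [0.2, 0.9]])   # assumed dynamics factor: z_{t+1} ~ A z_t
C = rng.normal(size=(dx, dz))             # assumed observation factor: x_t ~ C z_t

# Generate observations from a true latent trajectory.
z_true = np.zeros((T, dz)); z_true[0] = [1.0, 0.0]
for t in range(1, T):
    z_true[t] = A @ z_true[t - 1]
X = z_true @ C.T

def energy(z):
    obs = np.sum((X - z @ C.T) ** 2)            # observation factors
    dyn = np.sum((z[1:] - z[:-1] @ A.T) ** 2)   # dynamics factors
    return obs + dyn                            # scalar energy of a configuration

# Gradient-based inference: descend the energy w.r.t. the hidden states.
z = np.zeros((T, dz))
lr = 0.05
for _ in range(500):
    g_obs = -2.0 * (X - z @ C.T) @ C
    r = z[1:] - z[:-1] @ A.T
    g_dyn = np.zeros_like(z)
    g_dyn[1:] += 2.0 * r
    g_dyn[:-1] += -2.0 * r @ A
    z -= lr * (g_obs + g_dyn)
```

In the full method the factor parameters would also be trained (by descending the same energy with respect to `A` and `C`), alternating with inference over the latent states.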
This document shows the instances that were used for the CHeSC 2011 competition. These instances are available within the JAR file containing the HyFlex software framework[2] version used for the competition. The first four domains were released before the competition as training domains with 10 instances each. There are now 12 instances in each because we added two hidden instances for the com...
This work proposes a new architecture for deep neural network training. Instead of having one cascade of fully connected hidden layers between the input features and the target output, the new architecture organizes hidden layers into several regions with each region having its own target. Regions communicate with each other during the training process by connections among intermediate hidden l...
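The region idea can be sketched with a forward pass: two regions of hidden layers, each with its own target head, plus a connection from one region's intermediate hidden layer into the other. All names, sizes, and the single-layer-per-region simplification here are our own assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h, d_out = 10, 16, 3   # illustrative sizes

# Region A: its own hidden layer and its own target head.
W_a1 = rng.normal(scale=0.3, size=(d_in, d_h))
W_a2 = rng.normal(scale=0.3, size=(d_h, d_out))
# Region B: its own hidden layer, target head, and a connection
# receiving region A's intermediate hidden state.
W_b1 = rng.normal(scale=0.3, size=(d_in, d_h))
W_ab = rng.normal(scale=0.3, size=(d_h, d_h))   # inter-region connection
W_b2 = rng.normal(scale=0.3, size=(d_h, d_out))

x = rng.normal(size=d_in)
h_a = np.tanh(x @ W_a1)
out_a = h_a @ W_a2                      # region A trains against its own target
h_b = np.tanh(x @ W_b1 + h_a @ W_ab)    # region B also sees region A's hidden state
out_b = h_b @ W_b2                      # region B trains against its own target
```

During training, each region's loss would be computed against its own target while gradients also flow through the inter-region connection, which is how the regions "communicate" in the sense the abstract describes.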
The present study focuses on the parameters of the Artificial Neural Network (ANN). Sensitivity analysis is applied to assess the effect of ANN parameters on the prediction of turbidity of raw water in a water treatment plant. The results show that the transfer function of the hidden layer is a critical parameter of the ANN. When the transfer function changes, the reliability of pre...
It is possible to learn multiple layers of non-linear features by backpropagating error derivatives through a feedforward neural network. This is a very effective learning procedure when there is a huge amount of labeled training data, but for many learning tasks very few labeled examples are available. In an effort to overcome the need for labeled data, several different generative models were...
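The procedure named at the start of this abstract, backpropagating error derivatives through a feedforward network, can be shown on a tiny labeled task. This is a generic from-scratch sketch (XOR, one tanh hidden layer, sigmoid output, cross-entropy gradient); the task, sizes, and learning rate are our own choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: the classic task that needs a non-linear hidden layer.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # Forward pass through the hidden layer.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: with cross-entropy loss, dL/dlogits = p - y.
    dz2 = (p - y) / len(X)
    dW2, dB2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (1.0 - h ** 2)   # tanh'(a) = 1 - tanh(a)^2
    dW1, dB1 = X.T @ dz1, dz1.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * dB2
    W1 -= lr * dW1; b1 -= lr * dB1

preds = (p > 0.5).astype(float)
```

With plentiful labels this works well; the abstract's point is that when labels are scarce, generative pre-training of the hidden layers becomes attractive.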
In this paper we propose a Shared Hidden Layer Multisoftmax Deep Neural Network (SHL-MDNN) approach for semi-supervised training (SST). This approach aims to boost low-resource speech recognition where limited training data is available. Supervised data and unsupervised data share the same hidden layers but are fed into different softmax layers so that erroneous automatic speech recognition (AS...
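The shared-hidden-layer/multi-softmax structure can be sketched as a forward pass: one hidden stack shared by both data streams, with separate softmax heads for supervised and ASR-transcribed data. Sizes and weight scales are illustrative, not from the paper, and real SHL-MDNN training would backpropagate both losses through the shared layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Shared hidden stack (dimensions are illustrative).
W1 = rng.normal(scale=0.1, size=(40, 64))
W2 = rng.normal(scale=0.1, size=(64, 64))

# Two task-specific softmax heads: one fed supervised data,
# one fed unsupervised (ASR-transcribed) data.
W_sup = rng.normal(scale=0.1, size=(64, 10))
W_unsup = rng.normal(scale=0.1, size=(64, 10))

def shared_forward(x):
    h = np.tanh(x @ W1)
    return np.tanh(h @ W2)    # shared hidden representation

x_sup = rng.normal(size=(3, 40))      # batch with human labels
x_unsup = rng.normal(size=(5, 40))    # batch with ASR-derived labels

p_sup = softmax(shared_forward(x_sup) @ W_sup)
p_unsup = softmax(shared_forward(x_unsup) @ W_unsup)
```

The design intent, as the abstract describes it, is that erroneous ASR labels only directly shape their own softmax layer, while the shared hidden layers still benefit from the extra unsupervised data.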
We present preliminary results of experiments with two types of recurrent neural networks for a natural language learning task. The neural networks, Elman networks and Recurrent Cascade Correlation (RCC), were trained on the text of a first-year primary school reader. The networks performed a one-step-look-ahead task, i.e. they had to predict the lexical category of the next word. Elm...
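The one-step-look-ahead setup with an Elman network can be sketched as follows: the hidden state depends on the current input and a context copy of the previous hidden state, and the output scores the next word's lexical category. Untrained random weights and all sizes here are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

n_cat, n_hidden = 4, 8   # lexical categories, hidden units (illustrative sizes)
W_in = rng.normal(scale=0.5, size=(n_cat, n_hidden))
W_ctx = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # context recurrence
W_out = rng.normal(scale=0.5, size=(n_hidden, n_cat))

def step(x_onehot, context):
    # Elman network: hidden state mixes the input with the previous hidden state.
    h = np.tanh(x_onehot @ W_in + context @ W_ctx)
    logits = h @ W_out
    p = np.exp(logits - logits.max()); p /= p.sum()
    return p, h   # distribution over next categories, new context

# Feed a short sequence of word categories and predict the next one.
seq = [0, 2, 1, 3]
context = np.zeros(n_hidden)
for c in seq:
    p_next, context = step(np.eye(n_cat)[c], context)
```

Training would backpropagate the prediction error at each step; RCC instead grows its recurrent hidden units incrementally.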
For high-dimensional pattern recognition problems, the learning speed of gradient-based training algorithms (back-propagation) is generally very slow. Local minima, improper learning rates, and over-fitting are some of the other issues. The extreme learning machine was proposed as a non-iterative learning algorithm for single-hidden-layer feedforward neural networks (SLFNs) to overcome these issues. ...
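The non-iterative ELM scheme named here is simple enough to show directly: hidden weights are drawn at random and fixed, and only the output weights are solved in closed form by least squares. The toy task and all sizes below are our own illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) on [0, pi].
X = np.linspace(0, np.pi, 50).reshape(-1, 1)
y = np.sin(X)

# ELM for a single-hidden-layer feedforward network (SLFN):
# random fixed hidden weights, analytic output weights.
n_hidden = 20
W = rng.normal(size=(1, n_hidden))   # random input-to-hidden weights (never trained)
b = rng.normal(size=n_hidden)        # random hidden biases

H = np.tanh(X @ W + b)               # hidden-layer activations
beta = np.linalg.pinv(H) @ y         # least-squares output weights, non-iterative

pred = H @ beta
mse = np.mean((pred - y) ** 2)
```

Because there is no gradient loop, the slow convergence, learning-rate tuning, and local-minima issues the abstract lists simply do not arise; over-fitting is instead controlled by the number of hidden units (or a regularized pseudo-inverse).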