Search results for: feedforward neural network

Number of results: 834,601

1997
Cameron N. Riviere, Pradeep K. Khosla

We present a novel application of a neural network to augment manual precision by canceling involuntary motion. This method may be applied in microsurgery, using either a telerobotic approach or active compensation in a handheld instrument. A feedforward neural network is trained to take the measured trajectory of a handheld tool tip as input and to output the intended trajectory. Use of the neural networ...
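A minimal sketch of the idea described above, assuming a sliding window of recent measured samples as the network input; the signal model, window length, and network size are illustrative choices, not the authors' configuration.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 4000)
intended = np.sin(0.5 * t)                                  # slow voluntary motion
tremor = 0.15 * np.sin(12 * t)                              # fast involuntary component (toy model)
measured = intended + tremor + 0.01 * rng.normal(size=t.shape)

W = 16                                                      # window of recent measured samples
X = np.array([measured[i - W:i] for i in range(W, len(t))])
y = intended[W:]

net = MLPRegressor(hidden_layer_sizes=(32,), activation="tanh",
                   max_iter=3000, random_state=0)
net.fit(X[:3000], y[:3000])                                 # train on the first part of the record
print("held-out MSE:", np.mean((net.predict(X[3000:]) - y[3000:]) ** 2))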

Journal: Digital Signal Processing, 2007
Murat Hüsnü Sazli, Can Isik

In this paper, we first show that the BCJR algorithm (or Bahl algorithm) can be implemented via some matrix manipulations. As a direct result of this, we also show that this algorithm is equivalent to a feedforward neural network structure. We verified through computer simulations that this novel neural network implementation yields results identical to those of the BCJR algorithm. © 2006 Elsevier Inc...
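As an illustration of the matrix view (not the paper's construction), the BCJR forward-backward recursions can be written as plain matrix-vector products; the 2-state trellis and the branch-metric values below are made up for demonstration.

import numpy as np

# Gamma0[k][s', s] and Gamma1[k][s', s]: branch metrics for transitions caused by bit 0 and bit 1.
Gamma0 = [np.array([[0.6, 0.0], [0.0, 0.3]]),
          np.array([[0.5, 0.0], [0.0, 0.4]]),
          np.array([[0.7, 0.0], [0.0, 0.2]])]
Gamma1 = [np.array([[0.0, 0.4], [0.7, 0.0]]),
          np.array([[0.0, 0.5], [0.6, 0.0]]),
          np.array([[0.0, 0.3], [0.8, 0.0]])]
Gamma = [g0 + g1 for g0, g1 in zip(Gamma0, Gamma1)]
K = len(Gamma)

# Forward (alpha) and backward (beta) recursions as matrix products, normalized for stability.
alpha = [np.array([1.0, 0.0])]                  # trellis starts in state 0
for k in range(K):
    a = alpha[-1] @ Gamma[k]
    alpha.append(a / a.sum())
beta = [np.ones(2) / 2.0]
for k in reversed(range(K)):
    b = Gamma[k] @ beta[0]
    beta.insert(0, b / b.sum())

# A-posteriori probability of bit 1 at each trellis step.
for k in range(K):
    p1 = alpha[k] @ Gamma1[k] @ beta[k + 1]
    p0 = alpha[k] @ Gamma0[k] @ beta[k + 1]
    print(f"step {k}: P(bit=1 | r) = {p1 / (p0 + p1):.3f}")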

Journal: International Journal of Multimedia and Ubiquitous Engineering, 2014

Journal: Acoustics Australia, 2022

Abstract: The Helmholtz equation has been used for modeling the sound pressure field under a harmonic load. Computing such fields by solving this equation can quickly become infeasible if one wants to study many different geometries and ranges of frequencies. We propose a machine learning approach, namely a feedforward dense neural network, for computing the average sound pressure over a frequency range. The data are generated with finite eleme...
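A hedged sketch of the surrogate idea: a dense feedforward network mapping geometry and frequency-band descriptors to an averaged response. The input features, network size, and the synthetic target below are assumptions standing in for the finite element data used in the paper.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Hypothetical inputs: two geometry parameters and a band-center frequency.
X = rng.uniform([0.5, 0.5, 100.0], [2.0, 2.0, 1000.0], size=(2000, 3))
# Placeholder target standing in for an FEM-computed band-averaged pressure level.
y = 20 * np.log10(1.0 / (X[:, 0] * X[:, 1])) + 0.01 * X[:, 2] + rng.normal(0, 0.5, 2000)

model = MLPRegressor(hidden_layer_sizes=(64, 64), activation="relu",
                     max_iter=2000, random_state=0)
model.fit(X[:1600], y[:1600])
print("test R^2:", model.score(X[1600:], y[1600:]))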

2011
Klaus Neumann, Jochen J. Steil

Extreme learning machines are single-hidden layer feed-forward neural networks, where the training is restricted to the output weights in order to achieve fast learning with good performance. The success of learning strongly depends on the random parameter initialization. To overcome the problem of unsuited initialization ranges, a novel and efficient pretraining method to adapt extreme learnin...
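A minimal extreme learning machine sketch on toy data: the hidden-layer weights are drawn at random and never trained, and only the output weights are fitted with a single least-squares solve. The data, layer size, and initialization range are illustrative; the range chosen in the random draw is exactly the kind of choice the pretraining described above targets.

import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])   # toy regression target

n_hidden = 100
W = rng.normal(0, 1, size=(2, n_hidden))    # random input-to-hidden weights (kept fixed)
b = rng.normal(0, 1, size=n_hidden)         # random hidden biases (kept fixed)
H = np.tanh(X @ W + b)                       # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None) # output weights from one least-squares solve

pred = H @ beta
print("training MSE:", np.mean((pred - y) ** 2))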

2000
Adriana Dumitras, Faouzi Kossentini

We propose a method for high-order image subsampling using feedforward artificial neural networks (FANNs). Our method employs a tridiagonally symmetrical FANN, which is obtained by applying the design algorithm proposed in [1], and a small, fully connected FANN. We show that our subsampling method provides excellent speed-performance trade-offs as compared to those of the method based exclusivel...

Journal: :Neurocomputing 1998
Leonardo Franco, Sergio A. Cannas

We design new feed-forward multi-layered neural networks which perform different elementary arithmetic operations, such as bit shifting, addition of N p-bit numbers, and multiplication of two n-bit numbers. All the structures are optimal in depth and are polynomially bounded in the number of neurons and in the number of synapses. The whole set of synaptic couplings and thresholds is obtained exa...
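In the spirit of such constructions (though not taken from the paper), a depth-2 feedforward network of threshold units can add two 1-bit numbers; the weights below are chosen by hand rather than learned.

import numpy as np

step = lambda x: (x >= 0).astype(int)   # hard threshold activation

def add_1bit(a, b):
    x = np.array([a, b])
    # Hidden layer: an OR unit and an AND unit, both realized as threshold units.
    W1 = np.array([[1, 1],      # weights from input a to the two hidden units
                   [1, 1]])     # weights from input b to the two hidden units
    t1 = np.array([1, 2])       # fire if a+b >= 1 (OR), a+b >= 2 (AND)
    h = step(x @ W1 - t1)
    # Output layer: sum bit = OR minus twice AND, thresholded (i.e. XOR); carry bit = AND.
    W2 = np.array([[1, 0],      # weights from the OR unit
                   [-2, 1]])    # weights from the AND unit
    t2 = np.array([1, 1])
    out = step(h @ W2 - t2)
    return out[0], out[1]       # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "-> sum, carry =", add_1bit(a, b))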

2001
Mikael Bodén

This paper provides guidance on some of the concepts surrounding recurrent neural networks. In contrast to feedforward networks, recurrent networks can be sensitive to, and adapt to, past inputs. Backpropagation learning is described for feedforward networks, adapted to suit our (probabilistic) modeling needs, and extended to cover recurrent networks. The aim of this brief paper is to set the sce...
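A bare-bones backpropagation sketch for a single-hidden-layer feedforward network, trained with gradient descent on squared error to learn XOR; a toy illustration, not the probabilistic formulation the paper develops.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)
lr = 0.5
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # Forward pass through the two layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the squared-error gradient layer by layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 3))   # should approach [0, 1, 1, 0]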

1995
Christian Goerick

We propose a characteristic structure number interrelating the weight vectors of a feed-forward neural network. It allows the monitoring of the learning process of feedforward neural networks and the identification of characteristic points/phases during the learning process. Some properties are given, and results of applications to different networks are shown.

Chart: number of search results per year
