Feed-Forward Neural Networks
Author: Jouke Annemaen · Language: English · Hardback – 31 May 1995
Starting with the derivation of a specification and ending with its hardware implementation, analog hard-wired, feed-forward neural networks with on-chip back-propagation learning are designed in their entirety. On-chip learning is necessary in circumstances where fixed weight configurations cannot be used. It is also useful for compensating most of the mismatches and parameter tolerances that occur in hard-wired neural network chips.
Fully analog neural networks have several advantages over other implementations: low chip area, low power consumption, and high speed operation.
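The on-chip learning the book describes continuously adapts analog weights using back-propagated error signals. As a purely illustrative software analogue (not taken from the book), the sketch below trains a single sigmoid neuron with the delta rule — the simplest form of gradient-based weight adaptation — on a hypothetical OR task; the learning rate, task, and epoch count are arbitrary assumptions:

```python
import math
import random

def sigmoid(x):
    # Smooth, differentiable activation; its derivative y*(1-y)
    # appears in the back-propagated error term below.
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative training set: the logical OR function on binary inputs.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(2)]  # weights
b = 0.0                                            # bias
lr = 0.5                                           # learning rate (assumed)

for epoch in range(2000):
    for (x1, x2), t in data:
        y = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # Delta rule: gradient of the squared error passed back
        # through the sigmoid derivative.
        delta = (t - y) * y * (1 - y)
        w[0] += lr * delta * x1
        w[1] += lr * delta * x2
        b += lr * delta

# Thresholded outputs after training.
outputs = [round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2), _ in data]
print(outputs)
```

In the analog hardware treated in the book, the same adaptation happens with continuous-time circuitry rather than discrete software updates, and later chapters examine how finite precision and discretized weight changes affect this process.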
Feed-Forward Neural Networks is an excellent source of reference and may be used as a text for advanced courses.
| All formats and editions | Price | Delivery |
|---|---|---|
| Paperback (1) | 613.80 lei | 6-8 weeks |
| Springer Us – 13 Jul 2013 | 613.80 lei | 6-8 weeks |
| Hardback (1) | 622.30 lei | 6-8 weeks |
| Springer Us – 31 May 1995 | 622.30 lei | 6-8 weeks |
Price: 622.30 lei
Old price: 732.12 lei
-15%
Express points: 933
Estimated price in other currencies:
110.04€ • 126.19$ • 95.10£
Printed on demand
Economy delivery 27 April - 11 May
Specifications
ISBN-13: 9780792395676
ISBN-10: 0792395670
Pages: 238
Illustrations: XIII, 238 p.
Dimensions: 160 x 241 x 19 mm
Weight: 0.55 kg
Edition: 1995 edition
Publisher: Springer Us
Place of publication: New York, NY, United States
Target audience: Research
Contents
1. Introduction
2. The Vector Decomposition Method
3. Dynamics of Single Layer Nets
4. Unipolar Input Signals in Single-Layer Feed-Forward Neural Networks
5. Cross-talk in Single-Layer Feed-Forward Neural Networks
6. Precision Requirements for Analog Weight Adaptation Circuitry for Single-Layer Nets
7. Discretization of Weight Adaptations in Single-Layer Nets
8. Learning Behavior and Temporary Minima of Two-Layer Neural Networks
9. Biases and Unipolar Input Signals for Two-Layer Neural Networks
10. Cost Functions for Two-Layer Neural Networks
11. Some Issues for f′(x)
12. Feed-Forward Hardware
13. Analog Weight Adaptation Hardware
14. Conclusions
Nomenclature