Computational Learning Theory
Edited by Shai Ben-David. In English. Paperback – 3 March 1997
The book presents 25 revised full papers carefully selected from a total of 36 high-quality submissions. The volume spans the whole spectrum of computational learning theory, with a certain emphasis on mathematical models of machine learning. Among the topics addressed are machine learning, neural nets, statistics, inductive inference, computational complexity, information theory, and theoretical physics.
Specifications
ISBN-13: 9783540626855
ISBN-10: 3540626859
Pages: 348
Illustrations: CCCXLVIII, 338 p.
Dimensions: 155 x 235 x 19 mm
Weight: 0.53 kg
Edition: 1997
Publisher: Springer
Place of publication: Berlin, Heidelberg, Germany
Target audience: Research

Contents
Sample compression, learnability, and the Vapnik-Chervonenkis dimension
Learning boxes in high dimension
Learning monotone term decision lists
Learning matrix functions over rings
Learning from incomplete boundary queries using split graphs and hypergraphs
Generalization of the PAC-model for learning with partial information
Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability
Closedness properties in team learning of recursive functions
Structural measures for games and process control in the branch learning model
Learning under persistent drift
Randomized hypotheses and minimum disagreement hypotheses for learning with noise
Learning when to trust which experts
On learning branching programs and small depth circuits
Learning nearly monotone k-term DNF
Optimal attribute-efficient learning of disjunction, parity, and threshold functions
Learning pattern languages using queries
On fast and simple algorithms for finding maximal subarrays and applications in learning theory
A minimax lower bound for empirical quantizer design
Vapnik-Chervonenkis dimension of recurrent neural networks
Linear algebraic proofs of VC-dimension based inequalities
A result relating convex n-widths to covering numbers with some applications to neural networks
Confidence estimates of classification accuracy on new examples
Learning formulae from elementary facts
Control structures in hypothesis spaces: The influence on learning
Ordinal mind change complexity of language identification
Robust learning with infinite additional information