Detail
Article: Worst-Case Quadratic Loss Bounds for Prediction Using Linear Functions and Gradient Descent
By: Cesa-Bianchi, N.; Long, P. M.; Warmuth, M. K.
Type: Journal article - international scholarly
In collection: IEEE Transactions on Neural Networks vol. 7 no. 3 (1996), pages 604-619.
Topics: Temperature Gradient; worst-case; quadratic; prediction; linear functions; gradient descent
Availability
  • Central Library (Semanggi)
    • Call Number: II36.1
    • Non-reserve copies: 1 (available for loan: 0)
    • Reserve copies: none
Abstract: Studies the performance of gradient descent (GD) when applied to the problem of online linear prediction in arbitrary inner product spaces. We prove worst-case bounds on the sum of the squared prediction errors under various assumptions concerning the amount of a priori information about the sequence to predict. The algorithms we use are variants and extensions of online GD. Whereas our algorithms always predict using linear functions as hypotheses, none of our results requires the data to be linearly related. In fact, the bounds proved on the total prediction loss are typically expressed as a function of the total loss of the best fixed linear predictor with bounded norm. All the upper bounds are tight to within constants. Matching lower bounds are provided in some cases. Finally, we apply our results to the problem of online prediction for classes of smooth functions.