Item Information
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Burkhart, Michael C. | - |
dc.date.accessioned | 2023-04-03T08:16:22Z | - |
dc.date.available | 2023-04-03T08:16:22Z | - |
dc.date.issued | 2022 | - |
dc.identifier.uri | https://link.springer.com/article/10.1007/s11590-022-01895-5 | - |
dc.identifier.uri | https://dlib.phenikaa-uni.edu.vn/handle/PNK/7448 | - |
dc.description | CC BY | vi |
dc.description.abstract | To minimize the average of a set of log-convex functions, the stochastic Newton method iteratively updates its estimate using subsampled versions of the full objective’s gradient and Hessian. We contextualize this optimization problem as sequential Bayesian inference on a latent state-space model with a discriminatively-specified observation process. Applying Bayesian filtering then yields a novel optimization algorithm that considers the entire history of gradients and Hessians when forming an update. | vi |
dc.language.iso | en | vi |
dc.publisher | Springer | vi |
dc.subject | log-convex functions | vi |
dc.subject | latent state-space model | vi |
dc.title | Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions | vi |
dc.type | Article | vi |
Appears in Collections:
OER - Natural Sciences
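The abstract describes the baseline stochastic Newton method: at each iteration, the gradient and Hessian of the full objective are estimated from a random subsample, and a Newton step is taken. Below is a minimal 1-D sketch of that baseline update only (it does not implement the paper's Bayesian-filtering extension); the choice of log-convex functions f_i(x) = exp((x − b_i)²), the data `b`, the subsample size, and the iteration count are all illustrative assumptions, not from the paper.

```python
import numpy as np

# Illustrative log-convex family (assumption, not from the paper):
# f_i(x) = exp((x - b_i)^2), so (x - b_i)^2 convex => f_i log-convex.
rng = np.random.default_rng(0)
b = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # toy data; full-objective minimizer is 0

def subsampled_grad_hess(x, idx):
    """Gradient and Hessian of the average of f_i over the subsample idx."""
    d = x - b[idx]
    e = np.exp(d ** 2)
    g = np.mean(2.0 * d * e)                # subsampled gradient
    h = np.mean((2.0 + 4.0 * d ** 2) * e)   # subsampled Hessian (> 0, since each f_i is convex)
    return g, h

x = 0.5  # initial estimate
for _ in range(50):
    idx = rng.choice(len(b), size=3, replace=False)  # subsample the terms
    g, h = subsampled_grad_hess(x, idx)
    x -= g / h                                       # Newton step on the subsample

print(x)
```

Because each step uses only a subsample, the iterate fluctuates around the full-objective minimizer rather than converging exactly; the paper's contribution, per the abstract, is to filter the entire history of subsampled gradients and Hessians rather than using only the current draw.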