Concave Gaussian Variational Approximations for Inference in Large-Scale Bayesian Linear Models
Published on May 06, 2011 (3816 views)
Two popular approaches to forming principled bounds in approximate Bayesian inference are local variational methods and minimal Kullback-Leibler divergence methods. For a large class of models, we …
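As background (standard notation, not reproduced from the slides): local methods bound each likelihood factor in the integrand, for example the Jaakkola-Jordan bound on the logistic sigmoid, while the variational Gaussian (VG) approach lower-bounds the whole marginal-likelihood integral via Jensen's inequality with a Gaussian q(w) = N(w | m, S). A sketch for Bayesian logistic regression:

```latex
% Local bound: each sigmoid factor is bounded pointwise with a site parameter \xi_n
% (Jaakkola--Jordan bound; quadratic in a, hence yields a Gaussian-form integrand):
\sigma(a) \;\ge\; \sigma(\xi)\,\exp\!\Big(\tfrac{a-\xi}{2} - \lambda(\xi)\,(a^{2}-\xi^{2})\Big),
\qquad \lambda(\xi) = \frac{\tanh(\xi/2)}{4\xi}.

% VG (KL) bound: the integral itself is lower-bounded by Jensen's inequality,
% equivalently by subtracting KL(q || posterior) >= 0 from log Z:
\log Z \;=\; \log\!\int p(w)\prod_{n}\sigma\!\big(y_{n}x_{n}^{\top}w\big)\,dw
\;\ge\; \mathbb{E}_{q(w)}\Big[\log p(w) + \sum_{n}\log\sigma\!\big(y_{n}x_{n}^{\top}w\big)\Big] + \mathcal{H}\big[q\big].
```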
Chapter list
00:00  Concave Gaussian Variational Approximations for Bayesian GLMs
00:28  In Brief (1)
00:33  In Brief (2)
00:43  In Brief (3)
00:48  In Brief (4)
00:53  In Brief, Our Contribution (1)
00:58  In Brief, Our Contribution (2)
01:03  In Brief, Our Contribution (3)
01:08  In Brief, Our Contribution (4)
01:13  In Brief, Our Contribution (5)
01:28  Bayesian Linear Models
01:30  Bayesian Models
01:34  Bayesian Generalised Linear Models
02:29  Bayesian Logistic Regression (1)
03:02  Bayesian Logistic Regression (2)
03:12  Local and Variational Gaussian Bounds
03:18  Bounding Z
03:33  Local (bound integrand)
03:48  VG (bound integral)
04:01  Relationship between the Local and VG bounds
04:42  Local Variational Bounds
05:38  Variational Gaussian Bounds
07:14  Properties, Known results
07:27  Properties, New results
07:57  Relating the Local and VG Bounds
09:30  Scalability
09:33  Concavity of B
10:42  log-concave
11:10  VG Optimisation: Full Cholesky
11:33  VG Optimisation: Banded Cholesky
12:14  VG Optimisation: Chevron Cholesky
13:00  Experiments
13:04  Bayesian Logistic Regression: realsim
14:11  Inverse Modelling (MRI, gene reg. networks etc.)
14:49  Inverse Modelling
15:59  Summary (1)
16:08  Summary (2)
16:13  Summary (3)
16:30  Thank you!
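To make the "Bounding Z" and "VG Optimisation" chapters above concrete, here is a minimal NumPy sketch (illustrative, not the speaker's code) of the VG lower bound with a Cholesky-parameterised covariance, assuming Bayesian logistic regression with labels y_n in {-1, +1}, an isotropic prior w ~ N(0, s2*I), and q(w) = N(m, C C^T); the function name vg_bound and the use of Gauss-Hermite quadrature for the one-dimensional expectations are illustrative choices.

```python
# Minimal NumPy sketch (illustrative, not the speaker's code) of the Variational
# Gaussian (VG) lower bound on log Z for Bayesian logistic regression with
# prior w ~ N(0, s2 * I), labels y_n in {-1, +1}, and q(w) = N(m, C C^T).
import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # probabilists' Gauss-Hermite rule


def vg_bound(m, C, X, y, s2=1.0, n_quad=20):
    """B(m, C) = -KL(q || prior) + sum_n E_q[log sigmoid(y_n x_n^T w)]."""
    D = m.size
    # KL(N(m, S) || N(0, s2 I)) with S = C C^T; assumes diag(C) > 0.
    logdet_S = 2.0 * np.sum(np.log(np.diag(C)))
    trace_S = np.sum(C ** 2)
    kl = 0.5 * (trace_S / s2 + m @ m / s2 - D + D * np.log(s2) - logdet_S)

    # Under q, each a_n = y_n x_n^T w is Gaussian with mean mu_n and variance var_n.
    mu = y * (X @ m)
    var = np.sum((X @ C) ** 2, axis=1)

    # One-dimensional Gauss-Hermite quadrature for E[log sigmoid(a_n)].
    nodes, weights = hermegauss(n_quad)
    weights = weights / np.sqrt(2.0 * np.pi)          # normalise to a standard normal
    a = mu[:, None] + np.sqrt(var)[:, None] * nodes[None, :]
    log_sig = -np.logaddexp(0.0, -a)                  # numerically stable log sigmoid
    expected_loglik = np.sum(log_sig @ weights)

    return expected_loglik - kl


# Toy usage: random data, q initialised at the prior (m = 0, C = I).
rng = np.random.default_rng(0)
N, D = 50, 5
X = rng.normal(size=(N, D))
y = np.sign(rng.normal(size=N))
m, C = np.zeros(D), np.eye(D)
print(vg_bound(m, C, X, y))
```

The banded and chevron parameterisations in the later chapters constrain the structure of C so that the bound stays cheap to evaluate and optimise at scale; the sketch above uses an unconstrained lower-triangular C.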