<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>VBF Experiments, April 2026 on mlbot.blog</title><link>https://mlbot.blog/series/vbf-experiments-april-2026/</link><description>Recent content in VBF Experiments, April 2026 on mlbot.blog</description><generator>Hugo</generator><language>en-US</language><lastBuildDate>Thu, 30 Apr 2026 15:16:48 +0530</lastBuildDate><atom:link href="https://mlbot.blog/series/vbf-experiments-april-2026/index.xml" rel="self" type="application/rss+xml"/><item><title>A Tour Of Learned And Reference-Free Bayesian Filters</title><link>https://mlbot.blog/posts/bayesian-filtering-techniques-tour/</link><pubDate>Thu, 30 Apr 2026 14:03:11 +0530</pubDate><guid>https://mlbot.blog/posts/bayesian-filtering-techniques-tour/</guid><description>&lt;p&gt;This is a long technical note about a small research program in Bayesian filtering.
The starting point is familiar if you know Kalman filters: there is a hidden state,
there are noisy measurements, and the filter has to update its belief online.&lt;/p&gt;</description></item><item><title>Reference-Free Quadrature Filters For The Sine Benchmark</title><link>https://mlbot.blog/posts/quadrature-power-ep-filtering/</link><pubDate>Thu, 30 Apr 2026 13:18:00 +0530</pubDate><guid>https://mlbot.blog/posts/quadrature-power-ep-filtering/</guid><description>&lt;p&gt;The last part of the week stepped away from amortized training and asked a sharper question:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;How far can a deterministic, reference-free filtering update go if it directly integrates the known nonlinear likelihood and projects the result back to a strict family?&lt;/p&gt;</description></item><item><title>When Mixtures Beat Local ELBO In Nonlinear Filtering</title><link>https://mlbot.blog/posts/mixture-iwae-filtering-branch/</link><pubDate>Thu, 30 Apr 2026 13:17:00 +0530</pubDate><guid>https://mlbot.blog/posts/mixture-iwae-filtering-branch/</guid><description>&lt;p&gt;The nonlinear objective-repair branch left a clear gap: the best fully unsupervised strict Gaussian filter improved robustness, but still had weak coverage and very low variance ratios. The next branch tested whether that was an objective problem, a posterior-family problem, or a coupled problem.&lt;/p&gt;</description></item><item><title>Repairing a Nonlinear Strict Filter Without Reference Targets</title><link>https://mlbot.blog/posts/nonlinear-strict-filter-objective-repair/</link><pubDate>Thu, 30 Apr 2026 13:16:00 +0530</pubDate><guid>https://mlbot.blog/posts/nonlinear-strict-filter-objective-repair/</guid><description>&lt;p&gt;After the scalar benchmark, the work moved to a nonlinear sine-observation model:&lt;/p&gt;
\[
z_t = z_{t-1} + w_t,\quad w_t \sim \mathcal{N}(0,Q)
\]\[
y_t = x_t \sin(z_t) + v_t,\quad v_t \sim \mathcal{N}(0,R)
\]&lt;p&gt;The strict filtering contract stayed the same:&lt;/p&gt;
\[
q^F_t = \operatorname{update}(q^F_{t-1}, x_t, y_t)
\]&lt;p&gt;No hidden sequence state was allowed in the headline rows. The filter had to export an explicit online filtering marginal at each time step.&lt;/p&gt;</description></item><item><title>Variational Filtering, Rebuilt From the Linear Case</title><link>https://mlbot.blog/posts/vbf-linear-gaussian-calibration/</link><pubDate>Thu, 30 Apr 2026 13:15:00 +0530</pubDate><guid>https://mlbot.blog/posts/vbf-linear-gaussian-calibration/</guid><description>&lt;p&gt;This week started by rebuilding the variational Bayesian filtering experiments around a scalar linear-Gaussian state-space model. The goal was not to win on a toy problem. The goal was to make the mechanics testable before asking nonlinear questions:&lt;/p&gt;
\[
z_t = z_{t-1} + w_t,\quad w_t \sim \mathcal{N}(0, Q)
\]\[
y_t = x_t z_t + v_t,\quad v_t \sim \mathcal{N}(0, R)
\]&lt;p&gt;The filter carried a strict online marginal \(q^F_t(z_t)\) plus an edge/backward conditional \(q^B_t(z_{t-1} \mid z_t)\). That made the posterior edge factor explicit:&lt;/p&gt;</description></item></channel></rss>