So long, WordPress, and thanks for all the spam
Statisfaction
by Nicolas Chopin
8M ago
Please update your bookmarks: this blog has just moved to a new home, here, on GitHub. There is also a new RSS feed, so you may want to update your RSS reader as well.
New smoothing algorithms in particles
by Nicolas Chopin
1y ago
Hi, just a quick post to announce that particles now implements several of the smoothing algorithms introduced in our recent paper with Dang on the complexity of smoothing algorithms. Here is a plot that compares their running times for a given number of particles. All these algorithms are based on FFBS (forward filtering backward smoothing). The first two are not new: O(N²) FFBS is the classical FFBS algorithm, which has complexity O(N²); FFBS-reject uses (pure) rejection sampling to choose the ancestors in the backward step. In our paper, we explain that the running time of FFBS-reject is random, an…
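For readers unfamiliar with FFBS, the classical backward pass can be sketched as follows. This is a minimal illustration, not the particles implementation: the filter output and the Gaussian random-walk transition density are toy placeholders.

```python
import numpy as np

def ffbs_trajectory(particles, weights, transition_logpdf, rng):
    """Sample one smoothing trajectory with the classical FFBS backward pass.

    particles: list of (N,) arrays, one per time step (forward filter output)
    weights:   list of (N,) normalised weight arrays
    transition_logpdf(x_prev, x_next): log-density of the transition,
        vectorised in x_prev
    """
    T = len(particles)
    idx = rng.choice(len(weights[-1]), p=weights[-1])
    traj = [particles[-1][idx]]
    for t in range(T - 2, -1, -1):
        # backward weights: filter weight times transition density to the
        # state already selected at time t+1 (an O(N) sum per step, hence
        # O(N^2) overall when N trajectories are drawn)
        logw = np.log(weights[t]) + transition_logpdf(particles[t], traj[-1])
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(len(w), p=w)
        traj.append(particles[t][idx])
    return np.array(traj[::-1])

# toy example: uniform weights, standard-normal particles, Gaussian RW transition
rng = np.random.default_rng(0)
N, T = 100, 5
particles = [rng.normal(size=N) for _ in range(T)]
weights = [np.full(N, 1.0 / N) for _ in range(T)]
logm = lambda xp, xn: -0.5 * (xn - xp) ** 2
traj = ffbs_trajectory(particles, weights, logm, rng)
```

Drawing N trajectories this way costs O(N²), which is the baseline the paper's new algorithms improve upon.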
Particles 0.3: waste-free SMC, Fortran dependency removed, binary spaces
by Nicolas Chopin
2y ago
I have just released version 0.3 of particles (my Python Sequential Monte Carlo library). Here are the main changes. No more Fortran dependency: previous versions of particles relied on a bit of Fortran code to produce QMC (quasi-Monte Carlo) points, which was compiled automatically during installation. This worked fine for most users, but unfortunately not all. The latest (1.7) version of SciPy includes a stats.qmc sub-module. Particles 0.3 relies on this sub-module to generate QMC points, and is thus a pure Python package. This should mean fewer headaches when installing particles…
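For reference, here is how QMC points can be generated directly with the new SciPy sub-module; this is a minimal sketch of the public API, not necessarily how particles calls it internally.

```python
from scipy.stats import qmc  # requires SciPy >= 1.7

# scrambled Sobol' points in the unit square
sampler = qmc.Sobol(d=2, scramble=True, seed=42)
points = sampler.random_base2(m=3)  # 2^3 = 8 points, a balanced Sobol' sample
print(points.shape)  # (8, 2)
```

Sobol' sequences are best generated in powers of two, hence random_base2 rather than random.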
Online workshop: Measuring the quality of MCMC output
by Pierre Jacob
3y ago
Yes, we need a better logo. Hi all, with Leah South from QUT we are organizing an online workshop on the topic of “Measuring the quality of MCMC output”. The event website, with more info, is here: https://bayescomp-isba.github.io/measuringquality.html This is part of the ISBA BayesComp section's efforts to organize activities while waiting for the next “big” in-person meeting, hopefully in 2023. The event benefits from the generous support of the QUT Centre for Data Science. The event's website will be regularly updated between now and the event in October 2021, with three live sessions: 11am-2pm UTC…
Dempster’s analysis and donkeys
by Pierre Jacob
3y ago
This post is about estimating the parameter of a Bernoulli distribution from observations, in the “Dempster” or “Dempster–Shafer” way, which is a generalization of Bayesian inference. I'll recall what this approach is about, and describe a Gibbs sampler to perform the computation. Intriguingly, the associated Markov chain happens to be equivalent to the so-called “donkey walk” (not this one), as pointed out by Guanyang Wang and Persi Diaconis. Denote the observations, or “coin flips”, by x_1, …, x_n. The model stipulates that x_i = 1(u_i ≤ θ), where u_1, …, u_n are independent Uniform(0,1) variables, and θ is the parameter to be estimated…
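The sampling mechanism above can be sketched in a few lines; this is only an illustration of the model (with a made-up parameter value), not the donkey-walk Gibbs sampler described in the post.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.3   # hypothetical parameter value, for illustration only
n = 10
u = rng.uniform(size=n)         # the auxiliary Uniform(0,1) variables
x = (u <= theta).astype(int)    # observed coin flips: x_i = 1(u_i <= theta)
# Conditionally on the data, each u_i is constrained to [0, theta] when
# x_i = 1 and to (theta, 1] when x_i = 0; a Gibbs sampler for Dempster's
# analysis alternates between such constrained u's and compatible thetas.
print(x)
```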
Particles 0.2: what’s new, what’s next (your comments most welcome)
by nicolaschopin
3y ago
I have just released version 0.2 of my SMC Python library, particles. I list below the main changes, and discuss some ideas for the future of the library. New module: variance_estimators. This module implements various variance estimators that may be computed from a single run of an SMC algorithm, à la Chan and Lai (2013) and Lee and Whiteley (2018). For more details, see this notebook. New module: datasets. This module makes it easier to load the datasets included in the package. Here is a quick example:

from particles import datasets as dts

dataset = dts.Pima()
help(dataset)  # basic info on dataset…
On the benefits of reviewing papers
by Julyan Arbel
3y ago
Would you have agreed to review this paper if you had been asked? When I'm asked by students whether they should accept a referee invitation (be it for a statistics journal or a machine learning conference), I almost invariably say yes. I think there is a lot to be learnt when refereeing papers, and that it is worth the time spent in the process. I'll detail in this post why I think so. First, this post is not about tips on how to write a referee report, but rather about why you should. It is instructive to consult tips on the how, and good posts can be found out there. Note that some journals will also…
Post-doc position
by nicolaschopin
3y ago
Andras Fulop, Jeremy Heng (both ESSEC) and I (Nicolas Chopin, ENSAE, IPP) are currently advertising a post-doc position to work on developing SMC methods for challenging models found in finance and econometrics. If you are interested, click here for more details, and get in touch with us.
Everything You Always Wanted to Know About SMC, but were afraid to ask
by nicolaschopin
3y ago
Ever wanted to learn more about particle filters, sequential Monte Carlo, state-space/hidden Markov models, PMCMC (particle MCMC), SMC samplers, and related topics? In that case, you might want to check out the following book by Omiros Papaspiliopoulos and me, which has just been released by Springer, and which may be ordered from their website or from your favourite book store. The aim of the book is to cover the many facets of SMC: the algorithms, their practical uses in different areas, the underlying theory, how they may be implemented in practice, etc. Each chapter contains a “Python corner”…
Categorical distribution, structure of the second kind and Gumbel-max trick
by Pierre Jacob
4y ago
Hi all, this post is about a way of sampling from a categorical distribution which appears in Arthur Dempster's approach to inference, a generalization of Bayesian inference (see Figure 1 in “A Generalization of Bayesian Inference”, 1968), under the name “structure of the second kind”. It's the starting point of my ongoing work with Ruobin Gong and Paul Edlefsen, which I'll write about another day. This sampling mechanism turns out to be strictly equivalent to the “Gumbel-max” trick that has received some attention in machine learning; see e.g. this blog post by Francis Bach. Let's look at the fi…
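The Gumbel-max trick itself is easy to state: add independent standard Gumbel noise to the log-probabilities and take the argmax; the result is a draw from the categorical distribution. A minimal sketch (function and variable names are mine):

```python
import numpy as np

def gumbel_max_sample(logits, rng):
    """Draw index k with probability proportional to exp(logits[k])."""
    g = rng.gumbel(size=len(logits))   # independent standard Gumbel noise
    return int(np.argmax(logits + g))

# sanity check: empirical frequencies should approach the target probabilities
rng = np.random.default_rng(0)
p = np.array([0.2, 0.5, 0.3])
draws = [gumbel_max_sample(np.log(p), rng) for _ in range(20000)]
freq = np.bincount(draws, minlength=3) / len(draws)
```

A nice feature is that the logits need not be normalised, since adding a constant to all of them does not change the argmax.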
