Jochen's Blog — http://www.seehuhn.de/blog/
Jochen Voss (voss@seehuhn.de), http://www.seehuhn.de/

Fast-dm version 30.2 released
2015-04-17T22:07:39+00:00 http://www.seehuhn.de/blog/139
<p>I am happy to announce the release of fast-dm version 30.2. This
release fixes some memory leaks found by
<a href="http://valgrind.org/">Valgrind</a>. You can download the program
from the <a href="/pages/fast-dm">Fast-dm homepage</a>.
"new" paper
2015-02-06T14:32:41+00:00 http://www.seehuhn.de/blog/138
<p>A new paper, written together with my PhD student Mark Webster and my
colleague Stuart Barber, has been published today!
<ul class="compact">
<li><span itemprop="author">S. Barber</span>, <span itemprop="author">J. Voss</span> and <span itemprop="author">M. Webster</span>:
<span class="it"><span itemprop="name">The Rate of Convergence for Approximate Bayesian Computation</span></span>.
Electronic Journal of Statistics, vol. 9, pp. 80–105, 2015.
<br><span class="links"><a href="http://dx.doi.org/10.1214/15-EJS988">DOI:10.1214/15-EJS988</a>, <a href="http://arxiv.org/abs/1311.2038">arXiv:1311.2038</a>, <a href="/publications/BaVoWe13">more…</a></span>
</ul>
<p>This paper took a long time to be published, but I think it is worth
the wait.
slow article
2014-03-04T15:46:28+00:00 http://www.seehuhn.de/blog/137
<p>I am very happy to announce that an old (nearly forgotten?) article
has finally appeared. We chose the journal because we had heard the rumour
that it has a fast turnaround, but this plan did not work out: the
article was submitted on 23 September 2011 and accepted on 21 February 2012.
But then came a two-year wait, and the article has only appeared now:
<ul>
<li><span itemprop="author">K.V. Mardia</span> and <span itemprop="author">J. Voss</span>:
<span class="it"><span itemprop="name">Some Fundamental Properties of a Multivariate von Mises Distribution</span></span>.
Communications in Statistics — Theory and Methods, vol. 43,
pp. 1132–1144, 2014.
<br><span class="links"><a href="http://dx.doi.org/10.1080/03610926.2012.670353">DOI:10.1080/03610926.2012.670353</a>, <a href="http://arxiv.org/abs/1109.6042">arXiv:1109.6042</a>, <a href="/publications/MaVo14">more…</a></span>
</ul>
New PhD position
2013-11-22T17:00:37+00:00 http://www.seehuhn.de/blog/136
<p>Do you want to do a PhD on <q>Bayesian Uncertainty Quantification</q>
with me? Are you good with maths and computers? As part of the new,
<a href="http://en.wikipedia.org/wiki/Natural_Environment_Research_Council">NERC-funded</a>
doctoral training centre, I am advertising a PhD position. Details can
be found on the
<a href="http://www.nercdtp.leeds.ac.uk/projects/index.php?id=108">DTP web page</a>.
The deadline for applications is 24 January 2014, and the PhD starts in
October 2014.
The Rate of Convergence for Approximate Bayesian Computation
2013-11-08T23:20:39+00:00 http://www.seehuhn.de/blog/135
<p>I am very happy to report that after a long struggle we have finally
finished (and submitted) a new paper today:
<ul class="compact">
<li><span itemprop="author">S. Barber</span>, <span itemprop="author">J. Voss</span> and <span itemprop="author">M. Webster</span>:
<span class="it"><span itemprop="name">The Rate of Convergence for Approximate Bayesian Computation</span></span>.
Electronic Journal of Statistics, vol. 9, pp. 80–105, 2015.
<br><span class="links"><a href="http://dx.doi.org/10.1214/15-EJS988">DOI:10.1214/15-EJS988</a>, <a href="http://arxiv.org/abs/1311.2038">arXiv:1311.2038</a>, <a href="/publications/BaVoWe13">more…</a></span>
</ul>
<p>The paper studies the convergence of
<a href="http://en.wikipedia.org/wiki/Approximate_Bayesian_computation">ABC estimates</a>
to the correct value. In particular,
<ul class="compact">
<li>we prove that the method converges under very weak conditions,
<li>we show that the tolerance parameter δ should be chosen
proportional to <i>n</i><sup>-1/4</sup>, where <i>n</i> is the number of
accepted ABC samples, and
<li>we determine the speed of convergence under optimal choice
of δ.
</ul>
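<p>To illustrate the setting of the paper, here is a minimal sketch of plain
ABC rejection sampling in Python. The toy model (inferring the mean of a
normal distribution from its sample mean), the function names, and the prior
are my own illustrative choices, not taken from the paper; only the scaling
δ ∝ <i>n</i><sup>-1/4</sup> for the tolerance follows the result quoted above.

```python
import random
import statistics

def abc_rejection(observed, prior_sample, simulate, delta, n_accept):
    """Basic ABC rejection sampling (illustrative sketch).

    Repeatedly draw a parameter from the prior, simulate a summary
    statistic for it, and accept the draw whenever the simulated
    statistic lies within tolerance delta of the observed one.
    """
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if abs(simulate(theta) - observed) < delta:
            accepted.append(theta)
    return accepted

# Toy example: infer the mean of a normal distribution from its sample mean.
random.seed(0)
true_mean = 1.0
m = 100  # size of each (observed or simulated) data set
observed_stat = statistics.fmean(random.gauss(true_mean, 1.0) for _ in range(m))

prior = lambda: random.uniform(-5.0, 5.0)
simulate = lambda theta: statistics.fmean(random.gauss(theta, 1.0) for _ in range(m))

n = 200                  # number of accepted ABC samples
delta = n ** (-1 / 4)    # tolerance scaled as n^(-1/4), per the paper
samples = abc_rejection(observed_stat, prior, simulate, delta, n)
estimate = statistics.fmean(samples)
```

<p>The accepted draws approximate the posterior; their mean serves as the ABC
estimate, and shrinking δ with <i>n</i> at the stated rate balances bias
against Monte Carlo error.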
<p>Hopefully, these results will prove useful to others, too.