CLAHRCs are concerned with improving care, but many initiatives fail. This article seeks a method to predict success or failure in advance. The CLAHRC WM Director recently came across a pair of papers that used Bayes' theorem to predict successful organisational change interventions.
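
The mechanics of such a prediction can be sketched with Bayes' theorem directly. The numbers below are hypothetical, chosen only to illustrate the update; they are not taken from the papers discussed.

```python
# Toy Bayesian update for whether a change initiative will succeed.
# All probabilities here are made-up illustrative values.
prior = 0.2   # prior probability the initiative succeeds
sens = 0.8    # P(favourable early evidence | success)
fpr = 0.3     # P(favourable early evidence | failure)

# Bayes' theorem: P(success | evidence)
posterior = sens * prior / (sens * prior + fpr * (1 - prior))
print(round(posterior, 3))
```

Even strongly favourable early evidence moves a 20% prior only to a 40% posterior here, which is why base rates of failure matter so much in this kind of forecasting.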

## Tags » Bayesian

#### Responses to the BASP psychologists' p-value ban: the one that got away

When StatsLife collected responses to the BASP p-value ban (see blogs hither and yon), I suggested they contact Ian Hunt, a wise and philosophically minded critical voice in the wilderness of cookbook analysts.

#### MCMC in Python: Gaussian mixture model in PyMC3

PyMC3 is really coming along. I tried it out on a Gaussian mixture model that was the subject of some discussion on GitHub:

- https://github.com/pymc-devs/pymc3/issues/443#issuecomment-109813012
- http://nbviewer.ipython.org/gist/aflaxman/64f22d07256f67396d3a
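
For readers unfamiliar with the model being fit, the generative process behind a Gaussian mixture can be sketched in plain Python (this is not the PyMC3 code from the notebook; the component weights, means, and standard deviations below are made up):

```python
import random

random.seed(0)

# Hypothetical two-component mixture parameters (illustrative only)
weights = [0.3, 0.7]
means = [-2.0, 3.0]
sds = [1.0, 0.5]

def sample_gmm():
    """Draw one observation: pick a component, then draw from its Gaussian."""
    k = 0 if random.random() < weights[0] else 1
    return random.gauss(means[k], sds[k])

data = [sample_gmm() for _ in range(1000)]
```

Fitting the mixture means inverting this process: given only `data`, recover the weights, means, and spreads, which is what the PyMC3 model in the linked notebook does by sampling.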

#### Introducing StataStan

I have been working on a Stata add-on command to fit Bayesian models using Stan, and this is now out for testing. In this post, I want to introduce it, explain why it’s important, and encourage you all to try it out and give your feedback.

#### Laplace approximation in Python: another cool trick with PyMC3

I admit that I’ve been skeptical of the complete rewrite of PyMC that underlies version 3. It seemed to me motivated by an interest in using unproven new step methods that require knowing the derivative of the posterior distribution.
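
The Laplace approximation itself needs nothing from PyMC3: it replaces a posterior with a Gaussian centred at the mode, with variance set by the curvature there. A minimal sketch, using an unnormalised Beta(4, 3) density as a toy target (my choice of example, not one from the post):

```python
import math

# Unnormalised log-posterior: log p(t) = a*log(t) + b*log(1-t) + const
a, b = 3.0, 2.0

def d1(t):
    """First derivative of the log-posterior."""
    return a / t - b / (1.0 - t)

def d2(t):
    """Second derivative (curvature) of the log-posterior."""
    return -a / t**2 - b / (1.0 - t)**2

# Newton's method to find the mode, where d1(t) = 0
t = 0.5
for _ in range(50):
    t -= d1(t) / d2(t)

mode = t
var = -1.0 / d2(mode)  # Laplace: negative inverse curvature at the mode
print(mode, var, math.sqrt(var))
```

The approximating Gaussian is N(mode, var); for skewed targets like this one the fit is rough, which is the usual caveat with Laplace approximations.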

#### Objectivity in Service Delivery Research

The CLAHRC WM Director gave two recent talks about methodology and causal modelling in the evaluation of service delivery / quality improvement initiatives.

In both he received push back from a section of the audience.

#### Population Growth Estimation via Hamiltonian Monte Carlo

Here’s the same population growth estimation analysis, this time using Stan.

```
data {
  int<lower=0> N;  // number of observations
  vector[N] y;     // observed population
}
parameters {
  real r;          // growth rate
}
model {
  // Fixed constants of the logistic growth curve
  real k = 1.0;         // carrying capacity
  real p0 = 0.1;        // initial population
  real deltaT = 0.0005; // time step
  real sigma = 0.01;    // observation noise
  real mu0 = 5;         // prior mean for r
  real sigma0 = 10;     // prior sd for r
  vector[N] p;

  r ~ normal(mu0, sigma0);
  for (n in 1:N) {
    p[n] = k * p0 * exp((n - 1) * r * deltaT)
           / (k + p0 * (exp((n - 1) * r * deltaT) - 1));
    y[n] ~ normal(p[n], sigma);
  }
}
```
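
To see what data this model expects, the same logistic curve can be used to simulate observations in plain Python. The constants match those hard-coded in the Stan model; the "true" growth rate `r_true` is a hypothetical value chosen for the simulation:

```python
import math
import random

random.seed(1)

# Constants matching the Stan model above
k, p0, deltaT, sigma = 1.0, 0.1, 0.0005, 0.01
r_true = 10.0  # hypothetical true growth rate for the simulation

def logistic(n, r):
    """Logistic growth curve at step n, as in the Stan model block."""
    e = math.exp((n - 1) * r * deltaT)
    return k * p0 * e / (k + p0 * (e - 1.0))

N = 100
# Observations: curve value plus Gaussian noise, matching y[n] ~ normal(p[n], sigma)
y = [random.gauss(logistic(n, r_true), sigma) for n in range(1, N + 1)]
```

Passing `N` and `y` to the Stan program should then recover a posterior for `r` concentrated near `r_true`.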
