So I decided to post a pic from the Metropolis movie here to grab your interest. Hope it worked. Did it? Hope so. But I dunno. Actually, never cared for that movie but then again, what do I know about fine cinema? Like I thought Anna Faris was robbed of an Oscar for her work in House Bunny and the cute, sleeping baby in Ill Manors was robbed for being, well, a cute, sleeping baby. Again, what do I know? I dunno. But onto today’s topic of the Metropolis-Hastings algorithm. So this is an MCMC method (yes, MCMC like we covered here), where we sample candidate values and decide to accept or reject each one depending on whether a probability is less than a ratio of the distribution we are interested in — evaluated at the candidate value versus the current value. And the probability used in this comparison is itself drawn from a uniform distribution between 0 and 1 (so that any value between those two numbers is equally likely to be drawn). Now you’re lost, aren’t you? That’s okay. Let’s move on to the example then. That should help you. And me too as I’m kind of lost by what I just wrote. Wait … what? Um … you didn’t hear that, okay?
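If code helps more than my rambling, here’s a minimal sketch of that accept/reject dance in Python. The target (a standard normal) and the Gaussian random-walk proposal are just my choices for illustration — the post doesn’t commit to any particular distributions:

```python
import math
import random

def metropolis_hastings(log_target, proposal, x0, n_samples, seed=0):
    """Random-walk Metropolis-Hastings: propose a candidate, then accept
    or reject it by comparing a Uniform(0, 1) draw to the ratio of target
    densities (in log space, for numerical safety)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        x_new = proposal(x, rng)                    # propose a candidate
        log_ratio = log_target(x_new) - log_target(x)
        if math.log(rng.random()) < log_ratio:      # uniform draw vs. ratio
            x = x_new                               # accept the candidate
        samples.append(x)                           # on reject, keep old x
    return samples

# Illustrative target: standard normal, known only up to a constant --
# that's the whole point, MH never needs the normalizing constant.
log_normal = lambda x: -0.5 * x * x
step = lambda x, rng: x + rng.gauss(0, 1)           # symmetric proposal

samples = metropolis_hastings(log_normal, step, x0=0.0, n_samples=5000)
```

Note that the proposal here is symmetric, so the Hastings correction term cancels and only the target ratio survives; an asymmetric proposal would need the proposal densities in the ratio too.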
So anyway, say we consider Maggie Upton/Zelov again and she just enters the children’s ward in one dimension where she’s in Atlanta. But as she opens the door, there is an intersection of dimensions and she just entered a dimension where she again is a kid doctor but in say, St. Louis instead. Or in Detroit. Or in Indianapolis. Or she just remains in the realm in Atlanta. So we could give any of those possibilities an equal chance of happening. So we could accept any of those events as happening based on a probability that we randomly draw. By the way, an intersection of dimensions is described in my book as accompanied by a humming noise and a flash of lightning and a whole big thing — you’ll just have to read the books. But moving on — say she goes through the door and comes out from a luxurious Hawaiian place as Tina, another character in my book. Now, as nice as that would sound for Maggie, we would reject that possibility as the ratio involving the probabilities of turning into another character in another dimension is zero. What? Why? Because that’s how I wrote my story, that’s why! Because I said so. Look, if you want to write a book where one character can change into another character, go right ahead, but that’s not how I wrote my story. And if you check it out, you’ll know why. But anyway, that’s roughly what the Metropolis-Hastings algorithm is. I hope. I sent a link to this blog to my comp stat/Bayesian professor/dissertation advisor so if something’s not right here that he reads, I’m sure he’ll let me know. But anyway, that’s all for now but join me next time when I cover the Gibbs sampler, which is a specialized case of the Metropolis-Hastings algorithm. Or is the Metropolis-Hastings algorithm a generalized case of Gibbs or … well, we’ll talk about that next time.
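Maggie’s dimension-hopping can be sketched the same way as a discrete Metropolis sampler. The target weights below are completely made up for illustration — except that the Hawaii/Tina dimension gets probability zero, just like in the story, so that proposal is always rejected:

```python
import random

# Hypothetical target probabilities for Maggie's possible dimensions.
# The numbers are invented for this sketch; "Hawaii (as Tina)" gets zero
# because the story never lets one character turn into another.
target = {
    "Atlanta": 0.4,
    "St. Louis": 0.2,
    "Detroit": 0.2,
    "Indianapolis": 0.2,
    "Hawaii (as Tina)": 0.0,
}
states = list(target)

def step(current, rng):
    """One Metropolis step: propose any dimension with equal chance, then
    accept with probability min(1, target ratio). A zero-probability
    candidate (Hawaii/Tina) makes the ratio zero, so it never wins."""
    candidate = rng.choice(states)                 # uniform proposal
    ratio = target[candidate] / target[current]
    if rng.random() < min(1.0, ratio):             # uniform draw vs. ratio
        return candidate
    return current                                 # rejected: stay put

rng = random.Random(42)
x = "Atlanta"
visits = {s: 0 for s in states}
for _ in range(20000):
    x = step(x, rng)
    visits[x] += 1

# Maggie never ends up as Tina, and the other visit counts settle
# toward the target weights.
```

Run long enough, the fraction of time the chain spends in each city approaches the target probabilities, which is the whole point of the sampler.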