During my Ph.D., I have worked on the analysis of three Markov chains on discrete state spaces. Paper (1) has been published in a journal, paper (2) is expected to be submitted in May 2024, and paper (3) is in preparation.

  1. Mixing time of a non-reversible Markov chain on a hypercube.

    In this project, we present a non-reversible Markov chain on the n-dimensional hypercube {0,1}^n that converges in order n steps, whereas the previously known chains on the hypercube, which are reversible, require order n log n steps. The chain alternates between random moves and deterministic moves given by a linear shift register, and we prove that it also exhibits a phase transition.
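
    A minimal Python sketch of one step of this kind of dynamics, assuming, purely for illustration, that the random move resamples the first coordinate and that the deterministic move is a cyclic shift with a single extra feedback tap; the actual moves analyzed in the paper may differ.

        import random

        def step(x, taps=(0, 3)):
            """One step of a toy chain on {0,1}^n that alternates a random move
            with a deterministic linear shift-register move (illustrative only)."""
            x = list(x)
            # Random move: resample the first coordinate uniformly at random.
            x[0] = random.randint(0, 1)
            # Deterministic move: shift left and append the parity of the tap bits.
            feedback = 0
            for t in taps:
                feedback ^= x[t]
            return x[1:] + [feedback]

        # Run the toy chain for 100 steps from the all-zeros state with n = 16.
        x = [0] * 16
        for _ in range(100):
            x = step(x)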

  2. Mixing time of a non-reversible Markov chain on a random graph.

    We present a non-reversible Markov chain on the cycle Z_n with additional random edges and show that it converges in order log n steps with high probability. The random walk is shown to be locally approximated by an auxiliary process, and once this auxiliary process reaches an appropriate entropic threshold, the original Markov chain converges. Most previous examples of this entropic technique concern reversible chains, so one goal of this paper is to show that the idea also works for a non-reversible chain. The other goal is to provide another setting in which the convergence of a Markov chain is accelerated by adding extra (random) allowable transitions.
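
    A minimal Python sketch of a walk of this flavor, assuming, for illustration only, that the extra randomness consists of random chords attached to the cycle and that the non-reversible component is a drift of one step clockwise; the construction analyzed in the paper may differ.

        import random

        def make_chords(n, m, seed=0):
            """Attach m random chords (extra edges) to the cycle Z_n."""
            rng = random.Random(seed)
            chords = {}
            for _ in range(m):
                u, v = rng.randrange(n), rng.randrange(n)
                chords[u], chords[v] = v, u
            return chords

        def step(v, n, chords, p=0.5):
            """With probability p jump across a chord at v (if one exists);
            otherwise move one step clockwise, which makes the walk non-reversible."""
            if v in chords and random.random() < p:
                return chords[v]
            return (v + 1) % n

        # Run the toy walk for 50 steps on a cycle of length 1000 with 200 chords.
        n = 1000
        chords = make_chords(n, m=200)
        v = 0
        for _ in range(50):
            v = step(v, n, chords)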

  3. Mixing time of a Hamiltonian Monte Carlo Markov chain for the Ising model.

    We present a Hamiltonian Monte Carlo Markov chain to sample from the Ising model on the complete graph on n vertices, and show that it converges in O(n log n) steps at high temperature. Hamiltonian Monte Carlo chains are most commonly applied to continuous state spaces and have only recently been rigorously analyzed. One goal of this paper is to use such a chain on a continuous space to sample from a well-studied discrete probability distribution, namely the Ising model, via a projection. It is worth emphasizing that the projection of the Markov chain onto the discrete space is not itself a Markov chain; nevertheless, by coupling two Hamiltonian Monte Carlo chains on the continuous space, we prove that the projection converges in O(n log n) steps at high temperature. The other goal is to demonstrate an alternative to standard sampling algorithms for the Ising model, such as the Gibbs sampler.
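
    A minimal Python sketch of the general scheme, leapfrog-based Hamiltonian Monte Carlo on R^n followed by a sign projection onto spin configurations, using a placeholder double-well potential with a weak mean-field coupling; this placeholder is not the relaxation analyzed in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def leapfrog(q, p, grad_U, eps, L):
            """Standard leapfrog integration of Hamiltonian dynamics."""
            p = p - 0.5 * eps * grad_U(q)
            for i in range(L):
                q = q + eps * p
                p = p - (eps if i < L - 1 else 0.5 * eps) * grad_U(q)
            return q, p

        def hmc_step(q, U, grad_U, eps=0.05, L=20):
            """One HMC step on the continuous space; the spin sample is sign(q)."""
            p0 = rng.standard_normal(q.shape)
            q_new, p_new = leapfrog(q.copy(), p0.copy(), grad_U, eps, L)
            # Metropolis accept/reject based on the change in total energy H = U + |p|^2/2.
            dH = (U(q_new) + 0.5 * p_new @ p_new) - (U(q) + 0.5 * p0 @ p0)
            return q_new if rng.random() < np.exp(min(0.0, -dH)) else q

        # Placeholder potential (not the paper's relaxation): independent double
        # wells plus a weak mean-field coupling; spins are recovered as sign(q).
        def U(q, beta=0.5):
            return np.sum((q**2 - 1.0)**2) - beta / (2 * len(q)) * np.sum(np.tanh(q))**2

        def grad_U(q, beta=0.5):
            s = np.sum(np.tanh(q))
            return 4.0 * q * (q**2 - 1.0) - beta / len(q) * s * (1.0 - np.tanh(q)**2)

        q = rng.standard_normal(100)
        for _ in range(200):
            q = hmc_step(q, U, grad_U)
        sigma = np.sign(q)   # projected Ising configuration in {-1, +1}^n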