Hi, I'm Mike

I am currently a postdoctoral research fellow in computer science at Harvard University, supervised by Finale Doshi-Velez. We are actively exploring applications of machine learning to personalized medicine.

I recently (May 2016) completed my Ph.D. in computer science at Brown University, advised by Erik Sudderth. For my thesis, I studied large-scale unsupervised clustering problems, such as organizing every New York Times article from the last 20 years or automatically annotating videos of human activities. My technical focus was on developing optimization algorithms for a broad family of Bayesian nonparametric models, including mixtures, topic models, sequential models, and relational models. My collaborators and I have released an open-source Python package called BNPy. Please try it out!
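The core idea behind the mixture models above can be sketched with scikit-learn's variational Dirichlet-process mixture (an illustrative stand-in, not BNPy itself): over-provision the number of components and let variational inference with a Dirichlet-process prior shrink the weights of unused ones, so the number of active clusters is inferred from data.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Toy data: three well-separated 2-D Gaussian clusters.
rng = np.random.RandomState(0)
X = np.vstack([
    rng.randn(100, 2) + [0, 0],
    rng.randn(100, 2) + [8, 8],
    rng.randn(100, 2) + [0, 8],
])

# Truncated Dirichlet-process mixture fit by variational inference.
# We allow up to 10 components; the DP prior drives redundant
# components toward negligible weight during training.
model = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    max_iter=500,
    random_state=0,
).fit(X)

# Components with non-negligible weight approximate the active clusters;
# for this well-separated data we expect only a few to survive.
active = int((model.weights_ > 0.01).sum())
print(active)
```

This "start big, prune as you go" behavior is the same nonparametric idea the thesis algorithms pursue at much larger scale.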


  • [Aug 2016] Started post-doc at Harvard

    You can now find me at my new office in Maxwell-Dworkin (MD 209).

  • [May 2016] Successful Ph.D. defense!

    Many thanks to family and friends who supported me along the way.

  • [Jan 2016] Invited talks on my thesis.

    I visited several research groups at Northeastern, U. Washington, and MIT to discuss results from my thesis work on effective variational inference for clustering that scales to millions of examples. [slides PDF] [slides PPTX]

  • [Dec 2015] Invited talk at NIPS 2015 workshop.

    I gave an invited talk at the Bayesian Nonparametrics: The Next Generation workshop on my thesis work developing effective variational inference for models based on the Dirichlet process and its hierarchical variants. [slides PDF]

  • [Sept 2015] Paper accepted at NIPS 2015.

    Our paper [PDF] describes a new algorithm for Bayesian nonparametric hidden Markov models that can handle hundreds of sequences and add or remove hidden states during a single training run.

  • [May 2015] Paper accepted at AISTATS 2015.

    Our paper [PDF] describes a new algorithm for topic models that can effectively remove redundant or junk topics during a single training run.