Mozhi Zhang

mozhi[at]cs.umd.edu [CV] [Scholar] [Twitter]

I am a fifth-year Ph.D. student at the University of Maryland, advised by Jordan Boyd-Graber, and a member of the CLIP lab. I am currently visiting New York University, hosted by Kyunghyun Cho.

I study natural language processing and machine learning. Currently, I focus on building NLP models with limited training data, using cross-lingual and human-in-the-loop methods.

Previously, I completed my B.S. and M.S.E. at Johns Hopkins University, where I worked with Jason Eisner and David Yarowsky.

News

Publications

* = equal contribution

How Neural Networks Extrapolate: From Feedforward to Graph Neural Networks
Keyulu Xu, Mozhi Zhang, Jingling Li, Simon S. Du, Ken-ichi Kawarabayashi, Stefanie Jegelka
arXiv preprint
arxiv bibtex

Interactive Refinement of Cross-Lingual Word Embeddings
Michelle Yuan*, Mozhi Zhang*, Benjamin Van Durme, Leah Findlater, Jordan Boyd-Graber
EMNLP 2020
arxiv bibtex code video

Why Overfitting Isn't Always Bad: Retrofitting Cross-Lingual Word Embeddings to Dictionaries
Mozhi Zhang*, Yoshinari Fujinuma*, Michael J. Paul, Jordan Boyd-Graber
ACL 2020
arxiv bibtex code video

What Can Neural Networks Reason About?
Keyulu Xu, Jingling Li, Mozhi Zhang, Simon S. Du, Ken-ichi Kawarabayashi, Stefanie Jegelka
ICLR 2020 (Spotlight)
arxiv bibtex code

Exploiting Cross-Lingual Subword Similarities in Low-Resource Document Classification
Mozhi Zhang, Yoshinari Fujinuma, Jordan Boyd-Graber
AAAI 2020
arxiv bibtex

Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization
Mozhi Zhang, Keyulu Xu, Ken-ichi Kawarabayashi, Stefanie Jegelka, Jordan Boyd-Graber
ACL 2019
arxiv bibtex code