Current Advisees
- Snigdha Chaturvedi — 3rd-year Ph.D. Computational models of scientific literature, Bayesian modeling
- Dan Goldwasser — Postdoc (Ph.D. from UIUC). Dialog and belief state, education analytics
- He He — 3rd-year Ph.D. Budgeted learning, efficient inference
- Jiarong Jiang — 5th-year Ph.D. Efficient approximate inference, prioritized parsing
Past Advisees
(reverse chronologically)
- Abhishek Kumar — Ph.D. 2012 at UMD. Dissertation: Learning with Multiple Similarities. Now at IBM Research
- Taesun Moon — Postdoc (2011-13) at UMD. Computational analysis of scientific literature. Now at IBM Research
- Jagadeesh Jagarlamudi — Ph.D. 2012 at UMD. Dissertation: Discriminative Interlingual Representations. Now at IBM Research
- Amit Goyal — Ph.D. 2012 at UMD. Dissertation: Streaming and Sketch Algorithms for Large Data NLP. Now a Research Scientist at Yahoo! Labs
- Arvind Agarwal — Ph.D. 2012 at UMD. Dissertation: Geometric Methods in Machine Learning and Data Mining. Now at Xerox Research
- Piyush Rai — Ph.D. 2012 at Utah. Dissertation: Learning Latent Structures via Bayesian Nonparametrics: New Models and Efficient Inference. Now a postdoctoral fellow at UT Austin
- Adam Teichert — M.S. 2009 at Utah. Linguistically informed syntax. Now a Ph.D. student at JHU
- Scott Alfeld — B.S. 2008 at Utah. Now a Ph.D. student at Wisconsin
Prospective Students
If you are at any stage in the application process, please read the information on this page. If you want a sense of the kinds of problems my students work on, check out my list of current students. Alternatively, you can look for advice on applying to grad school, deciding which grad school to attend, or what classes to take.
Applying For Admission
I get lots of emails from potential students. I don't mind these emails at all, but just because I don't reply doesn't mean I haven't read them! If you want to make sure your email doesn't get spam filtered, include "020780" in the subject line; this will also tell me that you've read this page :). However, since I'm routinely asked roughly the same questions, it is more convenient for me to answer them here.
- Q: Will you be accepting new students for the year 20XX?
A: Almost certainly yes. Now that I have a handful of students, I want to maintain a steady state (perhaps expanding a bit). This means that I plan on taking one or two new students every year. (The precise number depends on the quality of the applicants as well as how much money I have!)
- Q: Is there anything special I should do when I apply?
A: Yes! If you want to work with me, be sure to list me as your first- or second-choice professor to work with, and list NLP as your area of choice. If in doubt, email me after you've completed your application, and I can check to ensure you show up in the system.
- Q: How will I get paid?
A: The current funding model works like this. Most incoming Ph.D. students are funded by the department for the first year (typically under a TAship). After that, you'll need to find an advisor and get funding through an RAship. For good students, this has never been a problem. We're unlikely to admit you if we don't think you'll be able to get an RAship after the first year. If you're really worried about this (which is sensible!), you can either email me (perhaps after getting admitted) or talk to some current students.
- Q: What is your research group like?
A: TODO.
- Q: What sort of problems are you working on?
A: My main interests are in statistical machine learning with applications to language and general AI. If your interests fit any one of these areas, we're a decent match; if they fit two, we're a good match. I'm particularly interested in models that express prior (human) knowledge about a problem in a reasonable way, and in models that work with structured data. My publications page is a good place to start; and, of course, you can always look at what my students are working on, or what I'm blogging about. But I'm quite flexible and am interested in a wide variety of areas (even things like applying machine learning to less standard problems, such as systems and graphics).
Deciding Where To Go
Coming soon...
What Classes To Take
I expect students working with me to have command of the material covered in the following classes (note: you needn't actually take them; you'd just better know the material):
- CMSC 651: Algorithms
- CMSC 726: Machine Learning
- CMSC 828: How to Conduct Research
Additionally, if you are interested in language, I expect you to take both Computational Linguistics I and II and any course from the linguistics department that you wish.
If you're more on the machine learning/statistics side, I expect you to take at least two classes related to math, numerical analysis or statistics.
In addition to that list, you might find the following courses of interest:
- CMSC 727: Neural Modeling
- CMSC 666: Numerical Analysis I
- CMSC 764: Advanced Numerical Optimization
- CMSC 634: Empirical Research Methods for Computer Science
- CMSC 734: Information Visualization
- CMSC 751: Parallel Algorithms
- CMSC 754: Computational Geometry
- CMSC 733: Computer Processing of Pictorial Information
- LING 610: Syntactic Theory
- LING 660: Introduction to Semantics
- MATH 630: Real Analysis I
- MATH 740: Riemannian Geometry
- MATH 742: Differential Topology
- STAT 600: Probability Theory I
- STAT 650: Applied Stochastic Processes
- STAT 700: Mathematical Statistics I
- STAT 705: Computational Statistics
- STAT 740: Linear Statistical Models I
- STAT 750: Multivariate Analysis
- STAT 770: Analysis of Categorical Data
- ENEE 627: Information Theory
- ENEE 631: Digital Image and Video Processing
- ENEE 632: Speech and Audio Processing
- ENEE 633: Statistical Pattern Recognition
- ENEE 731: Image Understanding
I would probably recommend taking 2-3 "real" classes per semester for your first year, plus 1-2 seminars. Most of the courses listed above require a fair amount of effort, so don't take three of them lightly. There's nothing wrong with putting one off for a semester or two in favor of having time to work on research.