Lili Mou, PhD

Assistant Professor, Faculty of Science - Computing Science

Contact

Email: lmou@ualberta.ca


About

Dr. Lili Mou is an Assistant Professor in the Department of Computing Science, University of Alberta. He is also an Alberta Machine Intelligence Institute (Amii) Fellow and a Canada CIFAR AI (CCAI) Chair. Lili received his BS and PhD degrees in 2012 and 2017, respectively, from the School of EECS, Peking University. After that, he worked as a postdoctoral fellow at the University of Waterloo and as a research scientist at Adeptmind (a startup in Toronto, Canada). His research interests include deep learning applied to natural language processing and to programming language processing. He has published at top conferences and in journals, including AAAI, ACL, CIKM, COLING, EMNLP, ICASSP, ICLR, ICML, IJCAI, INTERSPEECH, NAACL-HLT, and TACL (in alphabetical order).


Research

My research mission is to build an intelligent system that can understand and interact with humans via natural language, involving both text understanding and generation. Towards this long-term goal, I focus on fundamental problems in machine learning (especially deep learning) methods applied to natural language processing, including feature extraction in the discrete input space, weakly supervised learning in the discrete latent space, and sentence synthesis in the discrete output space. My work has been successfully applied to a variety of NLP tasks, including information extraction, semantic parsing, syntactic parsing, and text generation.


Teaching

CMPUT 651: Topics in Artificial Intelligence (Deep Learning for Natural Language Processing)

This course introduces deep learning (DL) techniques for natural language processing (NLP). Unlike other DL4NLP courses, we will take a whirlwind tour of the standard neural architectures (e.g., CNNs, RNNs, attention) in a few lectures. We will then devote significant effort to structured prediction with Bayesian and Markov networks, with applications to sequence labeling, syntactic parsing, and sentence generation. Along the way, we will also see how such traditional methods can be combined with a plain neural network and improve its performance.
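To make the last point concrete, here is a minimal sketch (in PyTorch, with hypothetical names such as TinyTagger and toy sizes; it is not taken from the course materials) of a plain neural sequence model augmented with CRF-style transition scores for sequence labeling, decoded with Viterbi:

    import torch
    import torch.nn as nn

    # Assumed toy sizes for illustration only.
    NUM_TAGS, EMB_DIM, HIDDEN = 5, 32, 64

    class TinyTagger(nn.Module):
        def __init__(self, vocab_size):
            super().__init__()
            self.emb = nn.Embedding(vocab_size, EMB_DIM)
            self.rnn = nn.LSTM(EMB_DIM, HIDDEN, batch_first=True)
            self.emit = nn.Linear(HIDDEN, NUM_TAGS)                     # neural per-token emission scores
            self.trans = nn.Parameter(torch.zeros(NUM_TAGS, NUM_TAGS))  # CRF-style tag-transition scores

        def emissions(self, tokens):                     # tokens: (1, seq_len) of word ids
            h, _ = self.rnn(self.emb(tokens))
            return self.emit(h).squeeze(0)               # (seq_len, NUM_TAGS)

        def viterbi(self, tokens):
            """Return the tag sequence maximizing emission + transition scores."""
            e = self.emissions(tokens)
            score, backpointers = e[0], []
            for t in range(1, e.size(0)):
                # total[i, j] = best score ending in tag i at t-1, then moving to tag j at t
                total = score.unsqueeze(1) + self.trans + e[t].unsqueeze(0)
                score, idx = total.max(dim=0)
                backpointers.append(idx)
            best = [score.argmax().item()]
            for idx in reversed(backpointers):           # backtrack to recover the path
                best.append(idx[best[-1]].item())
            return list(reversed(best))

    # Usage: decode tags for a toy 4-token sentence (output is a list of 4 tag indices).
    model = TinyTagger(vocab_size=100)
    print(model.viterbi(torch.tensor([[3, 17, 42, 9]])))

The transition matrix is the structured-prediction ingredient added on top of the per-token neural scores; in a full CRF, it would be trained jointly with the network using the forward algorithm rather than left at its initial values.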

Courses

CMPUT 272 - Formal Systems and Logic in Computing Science

An introduction to the tools of set theory, logic, and induction, and their use in the practice of reasoning about algorithms and programs. Basic set theory; the notion of a function; counting; propositional and predicate logic and their proof systems; inductive definitions and proofs by induction; program specification and correctness. Prerequisites: Any 100-level CMPUT course, CMPUT 274 or SCI 100.

Winter Term 2021
CMPUT 466 - Machine Learning

Learning is essential for many real-world tasks, including recognition, diagnosis, forecasting, and data mining. This course covers a variety of learning scenarios (supervised, unsupervised, and partially supervised), as well as foundational methods for regression, classification, dimensionality reduction, and modeling. Techniques such as kernels, optimization, and probabilistic graphical models will typically be introduced. The course also provides the formal foundations for understanding when learning is possible and practical. Prerequisites: one of CMPUT 340 or 418; one of STAT 141, 151, 235 or 265 or SCI 151; or consent of the instructor.

Fall Term 2020
CMPUT 499 - Topics in Computing Science

This topics course is designed as a one-on-one individual study course between a student and an instructor. Prerequisites are determined by the instructor in the course outline.

Fall Term 2020
CMPUT 566 - Topics in Machine Learning

Fall Term 2020
CMPUT 605 - Topics in Computing Science

Fall Term 2020
CMPUT 651 - Topics in Artificial Intelligence

Winter Term 2021
