Which topics to teach in what context?
Consider the following four NLP learners with different learning goals. Note that these are just example personas; by no means do they cover all possible NLP learners.
(Optional) Your name

Jim
- Jim is a 3rd year undergrad in computer science.
- Courses taken before: introductory machine learning, introductory statistics, calculus, and linear algebra.
- Goal: working in industry, preferably as a data scientist or a machine learning engineer.
- Expectation from the course: Getting familiar with different algorithms for learning from text and audio data to produce insights.

Sam
- Sam is a 4th year undergrad in political science.
- Courses taken before: introductory programming and statistics.
- Expectation from the course: She is fascinated by day-to-day applications of NLP such as smart compose and voice assistants and is curious about how they work. She is also interested in applying NLP tools in her own discipline.

Eva
- Eva is a first year Master's student in NLP.
- Courses taken before: coursework providing a solid foundation in machine learning, linguistics, and statistics.
- Goal: She wants to pursue a Ph.D. focused on deep learning for NLP.
- Expectation from the course: Interested in gaining a solid understanding of important models, tasks, and tools in NLP so that she can carry out her research effectively.

Mai
- Mai is a 4th year undergrad in linguistics.
- Courses taken before: introductory programming and statistics, multiple courses in linguistics.
- Expectation from the course: They are interested in seeing how their foundation in linguistics can be useful in the field of NLP. In particular, they want to understand different NLP models, how to use these models in different contexts, and how to conduct evaluation and error analysis of these models using insights from linguistics.

Which of the following models/algorithms would you teach in a course targeted towards each of our NLP learners?
Jim
Sam
Eva
Mai
Markov models of language (n-gram language models)
Hidden Markov models (HMMs)
Conditional Random Fields (CRFs)
Expectation maximization
Gaussian Mixture Models (GMMs)
Latent Dirichlet Allocation (LDA)
Singular Value Decomposition (SVD)
Latent Semantic Analysis (LSA)
Bag-of-words models
Word2vec, fastText, GloVe
BERT
RNNs
LSTMs
GRUs
Attention and transformers
GPT models
Rule-based models
Transformation-based models (e.g., Brill tagger)
Naive Bayes
Logistic regression
Noisy channel model
What other models do you teach to your students?
Which of the following topics would you cover for each of our NLP learners?
Jim
Sam
Eva
Mai
Regular Expressions
Basic text preprocessing pipeline (e.g., tokenization, stopwords)
POS tagging
Tagging: BIO tagging, NER
Constituency parsing
Dependency parsing
Semantic parsing
Natural Language Inference
Semantic Role Labeling
Discourse, coreference
WordNet, FrameNet, VerbNet
Morphology
Multilingual NLP
Phonetics
NLP and ethics
Social NLP
Annotation
Crowdsourcing
Dialog
Evaluation methodologies
Linguistic theory
What other topics do you teach to your students?
Which of the following applications would you talk about for each of our NLP learners?
Jim
Sam
Eva
Mai
Spelling correction
Text similarity
Machine translation
Automatic speech recognition (ASR)
Text-to-speech
Document clustering
Text generation
Image captioning
Question answering
Summarization
Text classification
Search, information retrieval
Conversational agents
List other applications you teach to your students.
Which of the following tools should we explicitly talk about for these NLP learners?
What other tools do you use in your NLP courses?
Any other comments?