Catalog Description: This course provides a graduate-level introduction to Natural Language Processing (NLP). We will survey foundational approaches such as word representations and n-gram language models, followed by neural methods including recurrent networks and attention mechanisms, and then progress to modern Transformer-based architectures. In addition, the course will cover advanced topics in contemporary NLP, including retrieval-augmented models, mixture-of-experts architectures, AI agents, and memorization.

Units: 4

Also Offered As: COMPSCI 288

Course Objectives:
- Learn foundational concepts in NLP, such as word representations, recurrence, attention, and n-gram language models.
- Learn concepts related to modern language models, including Transformers, pre-training, post-training, fine-tuning, reasoning models, and evaluation.
- Learn contemporary NLP topics, including retrieval-augmented models, mixture-of-experts, AI agents, and memorization.

Prerequisites: CS 288 assumes prior experience in machine learning and strong programming proficiency in PyTorch. Previous coursework in linguistics or natural language processing (e.g., an undergraduate-level NLP course such as EECS 183/283A) is recommended but not required.

Grading Basis: Default Letter Grade; P/NP Option

Final Exam Status: No


Class Schedule (Spring 2026):
CS 288 – TuTh 15:30-16:59, Soda 306 – Alane Suhr, Sewon Min

Class Notes
- The course website can be found here: https://cal-cs288.github.io/sp26/

- Students who cannot directly enroll can submit the form on the website. You'll be notified by 01/27.

- Reserved seats are not open for general enrollment; they are held for students in internal programs. Please DO NOT ask faculty or staff for one of these seats. Students who qualify have already been notified.
