Course title
Y02500293
Natural language processing

SASAKI Takeshi
Course description
Natural Language Processing (NLP), which enables computers to process the language used by humans in daily life, has seen a dramatic improvement in performance in recent years thanks to advances in artificial intelligence technologies. Today, it is widely used in everyday applications such as machine translation and chatbots. This course focuses on NLP based on deep learning and provides an overview of its underlying technologies and practical applications.
Purpose of class
The purpose of this course is to help students understand the basic technologies used in NLP, as well as the underlying principles and ideas behind various language models and analytical techniques.
Goals and objectives
  1. To explain methods to represent natural language numerically for computer processing
  2. To explain deep learning-based models used in natural language processing
  3. To explain parsing and other analytical techniques in natural language processing
Relationship between 'Goals and Objectives' and 'Course Outcomes'

       In-class quizzes  Final exam  Total
1.            5%            17%       22%
2.           15%            33%       48%
3.            5%            25%       30%
Total        25%            75%      100%
Language
Japanese
Class schedule

Each entry lists the class topic, the HW assignment (including preparation and review of the class), and the amount of time required.
1. Overview of natural language processing (NLP)
   ・NLP and AI
   ・Classical approaches (e.g., expert systems)
   Study Chapter 1 of the textbook (100 minutes)
2. Vector representations of words
   Study Chapter 3 of the textbook (180 minutes)
3. Neural networks used in NLP
   ・Typical neural networks
   ・Neural networks for sequential data
   Study Chapter 4 of the textbook (240 minutes)
4. Language models (1): what is a language model?
   Study up to Section 3 of Chapter 5 of the textbook (150 minutes)
5. Language models (2): sequence-to-sequence models
   Study Chapter 5 of the textbook (150 minutes)
6. Transformer (1): overview of the Transformer; components of the Transformer model (1)
   Study up to Section 3 of Chapter 6 of the textbook (150 minutes)
7. Transformer (2): components of the Transformer model (2); techniques for training models
   Study Chapter 6 of the textbook (150 minutes)
8. Transfer learning (1): pre-trained language models, GPT
   Study up to Section 2 of Chapter 7 of the textbook (150 minutes)
9. Transfer learning (2): BERT, BART, distillation
   Study Chapter 7 of the textbook (150 minutes)
10. Sequence labeling
   Study Chapter 8 of the textbook (240 minutes)
11. Syntactic parsing
   Study Chapter 9 of the textbook (240 minutes)
12. Applications and future of NLP (1)
   ・Generative AI (foundation models, LLMs)
   Study Chapter 11 of the textbook (150 minutes)
13. Applications and future of NLP (2)
   ・Training, inference, and evaluation
   Survey recent developments in NLP models (240 minutes)
14. Examination and feedback
   Prepare for the examination (360 minutes)
Total: 2,650 minutes
Evaluation method and criteria
In-class quizzes (25%) and final examination (75%)
- A passing score of 60% corresponds to being able to solve the exercises in the textbook
Feedback on exams, assignments, etc.
Ways of feedback: feedback in the class
Textbooks and reference materials
Okazaki et al., IT Text: 自然言語処理の基礎 (Fundamentals of Natural Language Processing), Ohmsha
Prerequisites
Knowledge of linear algebra, calculus, and machine learning (especially deep learning) is required.
Office hours and How to contact professors for questions
  • Questions are accepted during the lecture or in the break after the lecture.
Regionally-oriented
Non-regionally-oriented course
Development of social and professional independence
  • Course that cultivates an ability for utilizing knowledge
  • Course that cultivates basic self-management skills
  • Course that cultivates basic problem-solving skills
Active-learning course
More than one class is interactive
Course by professor with work experience
Work experience: N/A
Relevance to the course content (if applicable): Not applicable
Education related SDGs: the Sustainable Development Goals
  • 9.INDUSTRY, INNOVATION AND INFRASTRUCTURE
Last modified: Fri Oct 17 04:04:09 JST 2025