
Teach Less, Learn More (TLLM): Inspiration from Singapore Gov to DL


Teach Less, Learn More (TLLM) is a teaching initiative started by the Singapore Government in 2005. It inspired a popular Deep Learning paper that attempts to solve the problem of Undistillable Classes in Knowledge Distillation.

The paper is titled "Teach Less, Learn More: On the Undistillable Classes in Knowledge Distillation" and was published in 2022 at the Conference on Neural Information Processing Systems (NeurIPS), one of the most prestigious conferences in Deep Learning.

The authors include Yichen Zhu, Ning Liu, Zhiyuan Xu, Xin Liu, Weibin Meng, Louis Wang, Zhicai Ou and Jian Tang.

Table of contents:

  1. What is Teach Less, Learn More (TLLM)?
  2. Inspiration in Deep Learning

What is Teach Less, Learn More (TLLM)?

As the name "Teach Less, Learn More" suggests, the idea is that the teacher teaches fewer concepts, yet the student ends up learning more information and concepts.

Teach Less, Learn More (TLLM) is a policy introduced by the Singapore Government in 2005 to revolutionize the country's education system by fostering critical thinking, creativity and problem-solving skills among students. The main focus was to reduce rote memorization and direct instruction by teachers.

The core changes were:

  • Reduce the content that is taught in class.
  • Additional content is taught as and when students ask questions after discovering gaps in what has been covered.
  • More responsibility is placed on students than on teachers.
  • For assessments, students are tested on what they know and provided feedback based on their knowledge.
  • Teachers act as facilitators instead of being the primary source of information.
  • The curriculum is flexible, and parents are involved in the students' learning process.

Many education systems have since adopted this approach.

Inspiration in Deep Learning

This idea of TLLM has been adopted to discover a solution for the problem of Undistillable Classes in Knowledge Distillation in Deep Learning.

Knowledge Distillation is a technique in Deep Learning where the knowledge of a large, trained model (the teacher) is transferred to a smaller, simpler model (the student). The issue arises when specific parts of this knowledge, corresponding to particular classes, are not transferred effectively. These classes are known as Undistillable Classes, and they arise from the way Knowledge Distillation itself works.
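To make this concrete, below is a minimal sketch of a standard Knowledge Distillation loss (in the style of Hinton et al.), where the student is trained to match the teacher's softened class probabilities in addition to the ground-truth labels. The function name, temperature T and weight alpha are illustrative choices, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard Knowledge Distillation loss: a weighted sum of the KL
    divergence between the softened teacher and student distributions
    and the usual cross-entropy with the ground-truth labels."""
    # Soften both distributions with temperature T
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)

    # KL divergence term, scaled by T^2 as in the original formulation
    kd_term = F.kl_div(log_soft_student, soft_targets, reduction="batchmean") * (T * T)

    # Ordinary cross-entropy with the ground-truth labels
    ce_term = F.cross_entropy(student_logits, labels)

    return alpha * kd_term + (1 - alpha) * ce_term
```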

The solution is to identify these Undistillable Classes, skip them during Knowledge Distillation, and transfer all other knowledge from the teacher. The student model is then left to learn the skipped classes on its own from the ground-truth labels. This is directly inspired by the "Teach Less, Learn More" technique.
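As a rough illustration of this "teach less" idea (not the exact method from the paper), the sketch below masks out an assumed, pre-identified set of undistillable class indices from the distillation term, so the teacher's guidance is used only for the remaining classes while the student still learns every class from the ground-truth labels.

```python
import torch
import torch.nn.functional as F

def tllm_style_loss(student_logits, teacher_logits, labels,
                    undistillable, T=4.0, alpha=0.9):
    """'Teach less' sketch: drop undistillable classes from the
    distillation term and let the student learn them only from labels.
    `undistillable` is a list of class indices assumed to have been
    identified beforehand (hypothetical input for illustration)."""
    num_classes = student_logits.size(1)
    keep = torch.ones(num_classes, dtype=torch.bool, device=student_logits.device)
    keep[undistillable] = False  # the teacher is silenced on these classes

    # Distill only over the kept classes (renormalised softened softmax)
    soft_targets = F.softmax(teacher_logits[:, keep] / T, dim=1)
    log_soft_student = F.log_softmax(student_logits[:, keep] / T, dim=1)
    kd_term = F.kl_div(log_soft_student, soft_targets, reduction="batchmean") * (T * T)

    # The student still sees every class through the ground-truth labels
    ce_term = F.cross_entropy(student_logits, labels)

    return alpha * kd_term + (1 - alpha) * ce_term
```

Note that identifying which classes are undistillable is the key contribution of the paper itself; the `undistillable` argument here simply stands in for that step.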

This was presented in the NeurIPS paper titled "Teach Less, Learn More: On the Undistillable Classes in Knowledge Distillation" (published in 2022) by Yichen Zhu, Ning Liu, Zhiyuan Xu, Xin Liu, Weibin Meng, Louis Wang, Zhicai Ou and Jian Tang.

With this article at OpenGenus.org, you must have the complete idea of how the Teach Less, Learn More (TLLM) initiative from the Singapore Government inspired a solution in Deep Learning.
