Research Vision

Symbolic knowledge distillation is a machine-to-corpus-to-machine pipeline for commonsense knowledge that requires no human-authored knowledge. Knowledge is transferred from a large, general language model to a compact commonsense model through a machine-generated commonsense corpus, yielding both a commonsense knowledge graph and a commonsense model.
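
As an illustration only, the Python sketch below shows the shape of that pipeline; the teacher, critic, and student are hypothetical placeholder callables (none of these names or signatures come from the paper): the teacher generates candidate knowledge triples, a critic filters them into a corpus, and the student is trained on what remains.

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Triple:
        # ATOMIC-style (event, relation, inference) triple, e.g.
        # ("X pays Y a compliment", "xIntent", "to be nice")
        event: str
        relation: str
        inference: str

    def distill(
        teacher_generate: Callable[[str, str], List[str]],  # hypothetical: large general LM prompted for inferences
        critic_score: Callable[[Triple], float],             # hypothetical: learned critic scoring triple quality
        train_student: Callable[[List[Triple]], object],     # hypothetical: fine-tunes the compact commonsense model
        seed_events: List[str],
        relations: List[str],
        threshold: float = 0.5,                              # assumed cutoff; filtering strength is a design choice
    ) -> object:
        """Machine -> corpus -> machine: generate, filter, then train."""
        corpus: List[Triple] = [
            Triple(event, rel, inf)
            for event in seed_events
            for rel in relations
            for inf in teacher_generate(event, rel)
        ]
        # The critic keeps only high-quality generations; the surviving
        # corpus doubles as the symbolic commonsense knowledge graph.
        filtered = [t for t in corpus if critic_score(t) >= threshold]
        return train_student(filtered)

The filtering step is where the quality-versus-quantity trade-off of the corpus is made explicit: a stricter critic threshold yields a smaller but cleaner knowledge graph for the student to learn from.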

Paper

Symbolic Knowledge Distillation: from General Language Models to Commonsense Models

Peter West, Chandrasekhar Bhagavatula, Jack Hessel, Jena D. Hwang, Liwei Jiang, Ronan Le Bras, and 3 more... NAACL 2022