Symbolic knowledge distillation is a machine-to-corpus-to-machine pipeline for commonsense knowledge that requires no human-authored knowledge. Knowledge is transferred from a large, general language model to a compact commonsense model via an intermediate commonsense corpus, yielding both a commonsense knowledge graph and a commonsense model.
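The three-stage shape of the pipeline (teacher generates a corpus, the corpus is filtered, a student is trained on what remains) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: every function here is a hypothetical stand-in (a real pipeline would sample triples from a large language model, filter them with a learned critic, and fine-tune a smaller model).

```python
# Hypothetical sketch of symbolic knowledge distillation.
# All functions are illustrative stand-ins, not real model code.

def teacher_generate(prompts):
    """Stand-in for sampling commonsense triples from a large teacher LM."""
    return [(head, "xEffect", f"consequence of {head}") for head in prompts]

def critic_filter(triples, min_len=3):
    """Stand-in for a learned critic that keeps only high-quality triples."""
    return [t for t in triples if len(t[0]) >= min_len]

def train_student(corpus):
    """Stand-in for fine-tuning a compact student model on the corpus."""
    return {head: tail for head, _, tail in corpus}

prompts = ["X wins the race", "X", "X loses their keys"]
corpus = critic_filter(teacher_generate(prompts))  # the "commonsense corpus"
student = train_student(corpus)                    # the compact student model
```

The key design point the sketch mirrors is that the corpus is a first-class artifact: the filtered triples double as a symbolic knowledge graph, independent of the student model trained on them.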